[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
28983 1726882969.52912: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-4FB
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
28983 1726882969.53494: Added group all to inventory
28983 1726882969.53497: Added group ungrouped to inventory
28983 1726882969.53502: Group all now contains ungrouped
28983 1726882969.53505: Examining possible inventory source: /tmp/network-lQx/inventory.yml
28983 1726882969.79694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
28983 1726882969.79781: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
28983 1726882969.79810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
28983 1726882969.79896: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
28983 1726882969.80001: Loaded config def from plugin (inventory/script)
28983 1726882969.80004: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
28983 1726882969.80057: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
28983 1726882969.80186: Loaded config def from plugin
(inventory/yaml) 28983 1726882969.80189: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 28983 1726882969.80306: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 28983 1726882969.80911: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 28983 1726882969.80915: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 28983 1726882969.80919: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 28983 1726882969.80926: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 28983 1726882969.80931: Loading data from /tmp/network-lQx/inventory.yml 28983 1726882969.81031: /tmp/network-lQx/inventory.yml was not parsable by auto 28983 1726882969.81121: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 28983 1726882969.81174: Loading data from /tmp/network-lQx/inventory.yml 28983 1726882969.81290: group all already in inventory 28983 1726882969.81299: set inventory_file for managed_node1 28983 1726882969.81303: set inventory_dir for managed_node1 28983 1726882969.81305: Added host managed_node1 to inventory 28983 1726882969.81307: Added host managed_node1 to group all 28983 1726882969.81309: set ansible_host for managed_node1 28983 1726882969.81310: set ansible_ssh_extra_args for managed_node1 28983 1726882969.81313: set inventory_file for managed_node2 28983 1726882969.81317: set inventory_dir for managed_node2 28983 1726882969.81318: Added host managed_node2 to inventory 28983 1726882969.81320: Added host managed_node2 to group all 28983 1726882969.81321: set ansible_host for managed_node2 28983 1726882969.81322: set ansible_ssh_extra_args for managed_node2 28983 
1726882969.81325: set inventory_file for managed_node3 28983 1726882969.81328: set inventory_dir for managed_node3 28983 1726882969.81329: Added host managed_node3 to inventory 28983 1726882969.81331: Added host managed_node3 to group all 28983 1726882969.81332: set ansible_host for managed_node3 28983 1726882969.81333: set ansible_ssh_extra_args for managed_node3 28983 1726882969.81337: Reconcile groups and hosts in inventory. 28983 1726882969.81342: Group ungrouped now contains managed_node1 28983 1726882969.81345: Group ungrouped now contains managed_node2 28983 1726882969.81347: Group ungrouped now contains managed_node3 28983 1726882969.81447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 28983 1726882969.81622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 28983 1726882969.81693: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 28983 1726882969.81737: Loaded config def from plugin (vars/host_group_vars) 28983 1726882969.81740: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 28983 1726882969.81748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 28983 1726882969.81758: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 28983 1726882969.81818: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 28983 1726882969.82220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882969.82344: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 28983 1726882969.82406: Loaded config def from plugin (connection/local) 28983 1726882969.82410: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 28983 1726882969.83367: Loaded config def from plugin (connection/paramiko_ssh) 28983 1726882969.83371: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 28983 1726882969.84587: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28983 1726882969.84643: Loaded config def from plugin (connection/psrp) 28983 1726882969.84652: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 28983 1726882969.85759: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28983 1726882969.85819: Loaded config def from plugin (connection/ssh) 28983 1726882969.85822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 28983 1726882969.88489: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 28983 1726882969.88550: Loaded config def from plugin (connection/winrm) 28983 1726882969.88554: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 28983 1726882969.88595: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 28983 1726882969.88681: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 28983 1726882969.88789: Loaded config def from plugin (shell/cmd) 28983 1726882969.88791: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 28983 1726882969.88824: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 28983 1726882969.88943: Loaded config def from plugin (shell/powershell) 28983 1726882969.88946: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 28983 1726882969.89025: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 28983 1726882969.89339: Loaded config def from plugin (shell/sh) 28983 1726882969.89342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 28983 1726882969.89390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 28983 1726882969.89604: Loaded config def from plugin (become/runas) 28983 1726882969.89607: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 28983 1726882969.89890: Loaded config def from plugin (become/su) 28983 1726882969.89892: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 28983 1726882969.90109: Loaded config def from plugin (become/sudo) 28983 
1726882969.90112: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 28983 1726882969.90155: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml 28983 1726882969.90599: in VariableManager get_vars() 28983 1726882969.90633: done with get_vars() 28983 1726882969.90808: trying /usr/local/lib/python3.12/site-packages/ansible/modules 28983 1726882969.94651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 28983 1726882969.94821: in VariableManager get_vars() 28983 1726882969.94827: done with get_vars() 28983 1726882969.94830: variable 'playbook_dir' from source: magic vars 28983 1726882969.94831: variable 'ansible_playbook_python' from source: magic vars 28983 1726882969.94833: variable 'ansible_config_file' from source: magic vars 28983 1726882969.94835: variable 'groups' from source: magic vars 28983 1726882969.94836: variable 'omit' from source: magic vars 28983 1726882969.94837: variable 'ansible_version' from source: magic vars 28983 1726882969.94838: variable 'ansible_check_mode' from source: magic vars 28983 1726882969.94839: variable 'ansible_diff_mode' from source: magic vars 28983 1726882969.94840: variable 'ansible_forks' from source: magic vars 28983 1726882969.94841: variable 'ansible_inventory_sources' from source: magic vars 28983 1726882969.94842: variable 'ansible_skip_tags' from source: magic vars 28983 1726882969.94843: variable 'ansible_limit' from source: magic vars 28983 1726882969.94844: variable 'ansible_run_tags' from source: magic vars 28983 1726882969.94845: variable 'ansible_verbosity' from source: magic vars 28983 1726882969.94899: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml 28983 1726882969.95915: in VariableManager 
get_vars() 28983 1726882969.95938: done with get_vars() 28983 1726882969.96011: in VariableManager get_vars() 28983 1726882969.96028: done with get_vars() 28983 1726882969.96101: in VariableManager get_vars() 28983 1726882969.96118: done with get_vars() 28983 1726882969.96185: in VariableManager get_vars() 28983 1726882969.96207: done with get_vars() 28983 1726882969.96274: in VariableManager get_vars() 28983 1726882969.96292: done with get_vars() 28983 1726882969.96364: in VariableManager get_vars() 28983 1726882969.96383: done with get_vars() 28983 1726882969.96457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 28983 1726882969.96476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 28983 1726882969.96776: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 28983 1726882969.97036: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 28983 1726882969.97039: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 28983 1726882969.97087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 28983 1726882969.97120: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 28983 1726882969.97372: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 28983 1726882969.97507: Loaded config def from plugin (callback/default) 28983 1726882969.97511: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28983 1726882969.99855: Loaded config def from plugin (callback/junit) 28983 1726882969.99858: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28983 1726882969.99917: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 28983 1726882970.00020: Loaded config def from plugin (callback/minimal) 28983 1726882970.00022: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28983 1726882970.00077: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 28983 1726882970.00162: Loaded config def from plugin (callback/tree) 28983 1726882970.00170: Loading CallbackModule 
'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 28983 1726882970.00337: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 28983 1726882970.00340: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-4FB/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
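The inventory parse recorded earlier in this log (hosts managed_node1 through managed_node3, each given ansible_host and ansible_ssh_extra_args, all reconciled into group ungrouped) is consistent with a YAML inventory shaped roughly like the sketch below. Only the host names and key names come from the log; every value is a placeholder, since the real addresses and SSH options are not shown here.

```yaml
# Hypothetical reconstruction of /tmp/network-lQx/inventory.yml.
# Only keys named in the log appear; all values are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.1                                    # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder
    managed_node2:
      ansible_host: 192.0.2.2
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"
    managed_node3:
      ansible_host: 192.0.2.3
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"
```

Because the file defines plain hosts under `all`, the auto plugin falls through and the yaml inventory plugin parses it, matching the "was not parsable by auto" then "Attempting to use plugin yaml" sequence above.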
PLAYBOOK: tests_states_nm.yml **************************************************
2 plays in /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml
28983 1726882970.00376: in VariableManager get_vars()
28983 1726882970.00399: done with get_vars()
28983 1726882970.00407: in VariableManager get_vars()
28983 1726882970.00418: done with get_vars()
28983 1726882970.00424: variable 'omit' from source: magic vars
28983 1726882970.00530: in VariableManager get_vars()
28983 1726882970.00551: done with get_vars()
28983 1726882970.00577: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] ***********
28983 1726882970.05632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
28983 1726882970.05974: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
28983 1726882970.06243: getting the remaining hosts for this loop
28983 1726882970.06245: done getting the remaining hosts for this loop
28983 1726882970.06249: getting the next task for host managed_node2
28983 1726882970.06253: done getting next task for host managed_node2
28983 1726882970.06256: ^ task is: TASK: Gathering Facts
28983 1726882970.06258: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726882970.06260: getting variables
28983 1726882970.06262: in VariableManager get_vars()
28983 1726882970.06276: Calling all_inventory to load vars for managed_node2
28983 1726882970.06281: Calling groups_inventory to load vars for managed_node2
28983 1726882970.06285: Calling all_plugins_inventory to load vars for managed_node2
28983 1726882970.06300: Calling all_plugins_play to load vars for managed_node2
28983 1726882970.06315: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726882970.06320: Calling groups_plugins_play to load vars for managed_node2
28983 1726882970.06368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726882970.06645: done with get_vars()
28983 1726882970.06654: done getting variables
28983 1726882970.06738: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
Friday 20 September 2024 21:42:50 -0400 (0:00:00.065) 0:00:00.065 ******
28983 1726882970.06764: entering _queue_task() for managed_node2/gather_facts
28983 1726882970.06766: Creating lock for gather_facts
28983 1726882970.07444: worker is 1 (out of 1 available)
28983 1726882970.07455: exiting _queue_task() for managed_node2/gather_facts
28983 1726882970.07470: done queuing things up, now waiting for results queue to drain
28983 1726882970.07472: waiting for pending results...
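The deprecation warning at the very top of this run names its own two remedies: switch to the singular variable name, or disable deprecation warnings in ansible.cfg. A minimal sketch of both, assuming the collections tree from this log (/tmp/collections-4FB) and a fresh ansible.cfg in the working directory (the run above reported "No config file found; using defaults", so no existing file is overwritten):

```shell
# Use the singular name; ANSIBLE_COLLECTIONS_PATHS is removed in
# ansible-core 2.19 (path taken from the log above).
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-4FB

# Alternatively, silence deprecation warnings entirely.
printf '[defaults]\ndeprecation_warnings = False\n' > ansible.cfg
```

Either change on its own is enough to remove the warning; the config-file route also changes the "config file = None" line in the version banner.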
28983 1726882970.08098: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28983 1726882970.08443: in run() - task 0affe814-3a2d-b16d-c0a7-00000000001b 28983 1726882970.08448: variable 'ansible_search_path' from source: unknown 28983 1726882970.08553: calling self._execute() 28983 1726882970.08612: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882970.08672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882970.08789: variable 'omit' from source: magic vars 28983 1726882970.09340: variable 'omit' from source: magic vars 28983 1726882970.09343: variable 'omit' from source: magic vars 28983 1726882970.09345: variable 'omit' from source: magic vars 28983 1726882970.09393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882970.09441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882970.09473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882970.09504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882970.09522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882970.09590: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882970.09675: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882970.09686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882970.09923: Set connection var ansible_connection to ssh 28983 1726882970.09946: Set connection var ansible_shell_executable to /bin/sh 28983 1726882970.09964: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882970.10011: Set connection var 
ansible_timeout to 10 28983 1726882970.10024: Set connection var ansible_pipelining to False 28983 1726882970.10111: Set connection var ansible_shell_type to sh 28983 1726882970.10144: variable 'ansible_shell_executable' from source: unknown 28983 1726882970.10153: variable 'ansible_connection' from source: unknown 28983 1726882970.10161: variable 'ansible_module_compression' from source: unknown 28983 1726882970.10169: variable 'ansible_shell_type' from source: unknown 28983 1726882970.10181: variable 'ansible_shell_executable' from source: unknown 28983 1726882970.10191: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882970.10201: variable 'ansible_pipelining' from source: unknown 28983 1726882970.10431: variable 'ansible_timeout' from source: unknown 28983 1726882970.10436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882970.10743: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 28983 1726882970.10763: variable 'omit' from source: magic vars 28983 1726882970.10772: starting attempt loop 28983 1726882970.10782: running the handler 28983 1726882970.10801: variable 'ansible_facts' from source: unknown 28983 1726882970.10825: _low_level_execute_command(): starting 28983 1726882970.10839: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882970.12456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.12501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882970.12632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882970.12756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882970.12866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.14647: stdout chunk (state=3): >>>/root <<< 28983 1726882970.14840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882970.14844: stdout chunk (state=3): >>><<< 28983 1726882970.14846: stderr chunk (state=3): >>><<< 28983 1726882970.14866: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882970.14883: _low_level_execute_command(): starting 28983 1726882970.14892: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443 `" && echo ansible-tmp-1726882970.1486735-29001-50054265961443="` echo /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443 `" ) && sleep 0' 28983 1726882970.16356: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882970.16370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.16389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.16601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726882970.16694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.18761: stdout chunk (state=3): >>>ansible-tmp-1726882970.1486735-29001-50054265961443=/root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443 <<< 28983 1726882970.19240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882970.19243: stdout chunk (state=3): >>><<< 28983 1726882970.19247: stderr chunk (state=3): >>><<< 28983 1726882970.19249: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882970.1486735-29001-50054265961443=/root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882970.19251: variable 'ansible_module_compression' from source: unknown 28983 1726882970.19254: 
ANSIBALLZ: Using generic lock for ansible.legacy.setup 28983 1726882970.19256: ANSIBALLZ: Acquiring lock 28983 1726882970.19257: ANSIBALLZ: Lock acquired: 140284034522080 28983 1726882970.19259: ANSIBALLZ: Creating module 28983 1726882970.54583: ANSIBALLZ: Writing module into payload 28983 1726882970.54701: ANSIBALLZ: Writing module 28983 1726882970.54723: ANSIBALLZ: Renaming module 28983 1726882970.54729: ANSIBALLZ: Done creating module 28983 1726882970.54761: variable 'ansible_facts' from source: unknown 28983 1726882970.54768: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882970.54776: _low_level_execute_command(): starting 28983 1726882970.54787: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 28983 1726882970.55222: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882970.55226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.55231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882970.55233: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.55285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882970.55290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882970.55379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.57184: stdout chunk (state=3): >>>PLATFORM <<< 28983 1726882970.57263: stdout chunk (state=3): >>>Linux <<< 28983 1726882970.57283: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 28983 1726882970.57298: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 28983 1726882970.57441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882970.57486: stderr chunk (state=3): >>><<< 28983 1726882970.57489: stdout chunk (state=3): >>><<< 28983 1726882970.57504: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882970.57515 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 28983 1726882970.57557: _low_level_execute_command(): starting 28983 1726882970.57560: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 28983 1726882970.57645: Sending initial data 28983 1726882970.57649: Sent initial data (1181 bytes) 28983 1726882970.57984: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882970.57987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.57990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882970.57992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882970.57994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.58054: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882970.58056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882970.58128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.61830: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 28983 1726882970.62539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882970.62542: stdout chunk (state=3): >>><<< 28983 1726882970.62545: stderr chunk (state=3): >>><<< 28983 1726882970.62547: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882970.62550: variable 'ansible_facts' from source: unknown 28983 1726882970.62552: variable 'ansible_facts' from source: unknown 28983 1726882970.62554: variable 'ansible_module_compression' from source: unknown 28983 1726882970.62556: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28983 1726882970.62558: variable 'ansible_facts' from source: unknown 28983 1726882970.62704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py 28983 1726882970.62956: Sending initial data 28983 1726882970.62968: Sent initial data (153 bytes) 28983 1726882970.63475: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882970.63495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882970.63547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.63620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882970.63655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882970.63672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882970.63777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.65450: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882970.65513: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726882970.65581: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp7nvufgvv /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py <<< 28983 1726882970.65584: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py" <<< 28983 1726882970.65648: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp7nvufgvv" to remote "/root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py" <<< 28983 1726882970.67742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882970.67795: stderr chunk (state=3): >>><<< 28983 1726882970.67798: stdout chunk (state=3): >>><<< 28983 1726882970.67821: done transferring module to remote 28983 1726882970.67836: _low_level_execute_command(): starting 28983 1726882970.67842: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/ /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py && sleep 0' 28983 1726882970.68264: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882970.68267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.68273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882970.68275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.68323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882970.68326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882970.68401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.70390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882970.70393: stdout chunk (state=3): >>><<< 28983 1726882970.70396: stderr chunk (state=3): >>><<< 28983 1726882970.70496: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882970.70499: _low_level_execute_command(): starting 28983 1726882970.70501: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/AnsiballZ_setup.py && sleep 0' 28983 1726882970.70914: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882970.70921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882970.70928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
28983 1726882970.70959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882970.70962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882970.71011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882970.71015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882970.71106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882970.73307: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28983 1726882970.73374: stdout chunk (state=3): >>>import _imp # builtin <<< 28983 1726882970.73399: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28983 1726882970.73470: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 28983 1726882970.73515: stdout chunk (state=3): >>>import 'posix' # <<< 28983 1726882970.73573: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28983 1726882970.73576: stdout chunk (state=3): >>>import 'time' # <<< 28983 1726882970.73668: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 28983 1726882970.73671: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 28983 1726882970.73674: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 28983 1726882970.73676: stdout chunk (state=3): >>>import 'codecs' # <<< 28983 1726882970.73730: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 28983 1726882970.73760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464fa0c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f9dbb30> <<< 28983 1726882970.73791: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 28983 1726882970.73818: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464fa0eab0> import '_signal' # <<< 28983 1726882970.73855: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 28983 1726882970.73893: stdout chunk (state=3): >>>import 'io' # <<< 28983 1726882970.73896: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 28983 1726882970.74006: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28983 1726882970.74010: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 28983 1726882970.74090: stdout chunk (state=3): >>>import 'os' # <<< 28983 1726882970.74094: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 28983 1726882970.74156: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 28983 1726882970.74197: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f821160> <<< 28983 1726882970.74226: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.74246: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f821fd0> <<< 28983 1726882970.74270: stdout chunk (state=3): >>>import 'site' # <<< 28983 1726882970.74298: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28983 1726882970.74696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28983 1726882970.74708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 28983 1726882970.74729: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 28983 1726882970.74733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.74754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28983 1726882970.74798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 28983 1726882970.74814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28983 1726882970.74841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28983 1726882970.74852: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85fdd0> <<< 28983 1726882970.74869: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28983 1726882970.74888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28983 1726882970.74913: stdout chunk (state=3): >>>import '_operator' # <<< 28983 1726882970.74918: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85ffe0> <<< 28983 1726882970.74942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28983 1726882970.74959: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 28983 1726882970.74993: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 28983 1726882970.75037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.75061: stdout chunk (state=3): >>>import 'itertools' # <<< 28983 1726882970.75084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f897800> <<< 28983 1726882970.75107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 28983 1726882970.75120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 28983 1726882970.75136: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f897e90> <<< 28983 1726882970.75144: stdout chunk (state=3): >>>import '_collections' # <<< 28983 1726882970.75199: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f877aa0> <<< 28983 1726882970.75204: stdout chunk (state=3): >>>import '_functools' # <<< 28983 1726882970.75228: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f875190> <<< 28983 1726882970.75327: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85cf80> <<< 28983 1726882970.75349: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 28983 1726882970.75380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 28983 1726882970.75407: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28983 1726882970.75427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28983 1726882970.75454: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 28983 1726882970.75459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28983 1726882970.75496: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8bb710> <<< 28983 1726882970.75502: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ba330> <<< 28983 1726882970.75538: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 28983 1726882970.75544: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f876060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8b8a40> <<< 28983 1726882970.75604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 28983 1726882970.75611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ec6e0> import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f464f85c200> <<< 28983 1726882970.75630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 28983 1726882970.75668: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f8ecb90> <<< 28983 1726882970.75674: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8eca40> <<< 28983 1726882970.75709: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.75715: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f8ece00> <<< 28983 1726882970.75728: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85ad20> <<< 28983 1726882970.75752: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.75774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28983 1726882970.75810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28983 
1726882970.75823: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ed4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ed190> <<< 28983 1726882970.75840: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 28983 1726882970.75861: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 28983 1726882970.75871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 28983 1726882970.75887: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ee3c0> <<< 28983 1726882970.75896: stdout chunk (state=3): >>>import 'importlib.util' # <<< 28983 1726882970.75903: stdout chunk (state=3): >>>import 'runpy' # <<< 28983 1726882970.75925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28983 1726882970.75955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 28983 1726882970.75987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 28983 1726882970.75997: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f9085c0> <<< 28983 1726882970.76002: stdout chunk (state=3): >>>import 'errno' # <<< 28983 1726882970.76039: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f909d00> <<< 28983 1726882970.76065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28983 1726882970.76070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28983 1726882970.76130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28983 1726882970.76152: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f90abd0> <<< 28983 1726882970.76188: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f90b230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f90a120> <<< 28983 1726882970.76250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28983 1726882970.76268: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f90bc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f90b3b0> <<< 28983 1726882970.76316: stdout chunk (state=3): >>>import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ee3f0> <<< 28983 1726882970.76350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 28983 1726882970.76382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28983 1726882970.76437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 28983 1726882970.76441: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f61bb30> <<< 28983 1726882970.76497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f644620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f644380> <<< 28983 1726882970.76548: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f644590> 
<<< 28983 1726882970.76556: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f6447d0> <<< 28983 1726882970.76577: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f619cd0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28983 1726882970.76719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28983 1726882970.76727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 28983 1726882970.76730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 28983 1726882970.76783: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f645e50> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f644b00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8eeae0> <<< 28983 1726882970.76786: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28983 1726882970.76875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28983 1726882970.76917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28983 1726882970.76928: stdout chunk (state=3): 
>>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6761e0> <<< 28983 1726882970.76991: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 28983 1726882970.77007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.77026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28983 1726882970.77033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28983 1726882970.77084: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f68e390> <<< 28983 1726882970.77104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28983 1726882970.77143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28983 1726882970.77203: stdout chunk (state=3): >>>import 'ntpath' # <<< 28983 1726882970.77225: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 28983 1726882970.77233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6cb110> <<< 28983 1726882970.77249: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28983 1726882970.77282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 28983 1726882970.77303: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28983 1726882970.77348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28983 1726882970.77437: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6f18b0> <<< 28983 1726882970.77512: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6cb230> <<< 28983 1726882970.77554: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f68f020> <<< 28983 1726882970.77586: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5101d0> <<< 28983 1726882970.77602: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f68d3d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f646d80> <<< 28983 1726882970.77776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28983 1726882970.77793: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f464f5103b0> <<< 28983 1726882970.77975: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ot0l9ldf/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 28983 1726882970.78138: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.78161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches 
/usr/lib64/python3.12/pkgutil.py <<< 28983 1726882970.78182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28983 1726882970.78219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28983 1726882970.78303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28983 1726882970.78335: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 28983 1726882970.78339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f575eb0> <<< 28983 1726882970.78349: stdout chunk (state=3): >>>import '_typing' # <<< 28983 1726882970.78552: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f54cda0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f513ec0> <<< 28983 1726882970.78560: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.78584: stdout chunk (state=3): >>>import 'ansible' # <<< 28983 1726882970.78601: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.78619: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.78640: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.78646: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 28983 1726882970.78666: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.80229: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.81529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f54fd10> <<< 28983 1726882970.81552: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 28983 1726882970.81558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.81588: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 28983 1726882970.81597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28983 1726882970.81619: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 28983 1726882970.81625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28983 1726882970.81657: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.81663: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f5a57f0> <<< 28983 1726882970.81699: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a5580> <<< 28983 1726882970.81727: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a4ec0> <<< 28983 1726882970.81757: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 28983 
1726882970.81761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28983 1726882970.81805: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a5970> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5768d0> <<< 28983 1726882970.81822: stdout chunk (state=3): >>>import 'atexit' # <<< 28983 1726882970.81852: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.81857: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f5a6570> <<< 28983 1726882970.81882: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.81890: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f5a67b0> <<< 28983 1726882970.81912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28983 1726882970.81984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 28983 1726882970.82040: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a6cf0> <<< 28983 1726882970.82065: stdout chunk (state=3): >>>import 'pwd' # <<< 28983 1726882970.82081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28983 1726882970.82132: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f40ca70> <<< 28983 1726882970.82174: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f40e690> <<< 28983 1726882970.82196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28983 1726882970.82245: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f40f020> <<< 28983 1726882970.82266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28983 1726882970.82312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 28983 1726882970.82316: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f40ffb0> <<< 28983 1726882970.82364: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28983 1726882970.82389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28983 1726882970.82404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28983 1726882970.82470: stdout chunk (state=3): >>>import 
'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f412c30> <<< 28983 1726882970.82497: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f412f60> <<< 28983 1726882970.82520: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f410ef0> <<< 28983 1726882970.82567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28983 1726882970.82585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 28983 1726882970.82602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28983 1726882970.82629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 28983 1726882970.82672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 28983 1726882970.82699: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f416a80> import '_tokenize' # <<< 28983 1726882970.82802: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f415550> import 'linecache' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f464f4152b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28983 1726882970.82870: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f417b60> <<< 28983 1726882970.82900: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f411370> <<< 28983 1726882970.82935: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f45ac90> <<< 28983 1726882970.82974: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45ae70> <<< 28983 1726882970.82992: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 28983 1726882970.83009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 28983 1726882970.83032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28983 1726882970.83069: stdout chunk (state=3): >>># extension module '_datetime' 
loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.83075: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f45c890> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45c650> <<< 28983 1726882970.83097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28983 1726882970.83217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28983 1726882970.83270: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f45ede0> <<< 28983 1726882970.83276: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45cf80> <<< 28983 1726882970.83297: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28983 1726882970.83357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.83370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 28983 1726882970.83389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 28983 1726882970.83395: stdout chunk (state=3): >>>import '_string' # <<< 28983 1726882970.83453: stdout chunk (state=3): >>>import 'string' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f464f46a600> <<< 28983 1726882970.83606: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45ef90> <<< 28983 1726882970.83699: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46b8c0> <<< 28983 1726882970.83717: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.83723: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46b770> <<< 28983 1726882970.83775: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.83785: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46ba10> <<< 28983 1726882970.83807: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45af60> <<< 28983 1726882970.83817: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 28983 1726882970.83825: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28983 1726882970.83844: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 28983 1726882970.83874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28983 1726882970.83902: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.83937: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46ea80> <<< 28983 1726882970.84140: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.84147: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46fd70> <<< 28983 1726882970.84154: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f46d220> <<< 28983 1726882970.84194: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46e0f0> <<< 28983 1726882970.84205: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f46cda0> <<< 
28983 1726882970.84225: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84228: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 28983 1726882970.84260: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84363: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84479: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84486: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84496: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 28983 1726882970.84513: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84523: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84534: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 28983 1726882970.84550: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84688: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.84832: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.85592: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.86208: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 28983 1726882970.86225: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 28983 1726882970.86256: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28983 1726882970.86268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.86339: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f2f9550> <<< 28983 1726882970.86459: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28983 1726882970.86477: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2f8e60> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f575e20> <<< 28983 1726882970.86554: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 28983 1726882970.86557: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.86588: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 28983 1726882970.86609: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.86793: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.86989: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28983 1726882970.86994: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2f9490> <<< 28983 1726882970.87015: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.87576: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88126: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88219: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88315: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28983 1726882970.88335: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88361: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88416: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 28983 1726882970.88419: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88505: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88625: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28983 1726882970.88652: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88674: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 28983 1726882970.88724: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.88759: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28983 1726882970.88788: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.89062: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.89358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28983 1726882970.89430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 28983 1726882970.89463: stdout chunk (state=3): >>>import '_ast' # <<< 28983 1726882970.89553: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2fb410> <<< 28983 1726882970.89557: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.89633: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.89730: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 28983 1726882970.89772: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 28983 1726882970.89775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 28983 1726882970.89807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28983 1726882970.89872: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.89997: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f301b20> <<< 28983 1726882970.90068: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f302420> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2fa420> <<< 28983 1726882970.90093: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90142: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90193: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28983 1726882970.90197: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90246: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90291: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90354: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90421: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28983 1726882970.90481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.90577: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f3010a0> <<< 28983 1726882970.90613: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f302600> <<< 28983 1726882970.90662: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 28983 1726882970.90676: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90741: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90827: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90843: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.90891: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.90946: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28983 1726882970.90949: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 28983 1726882970.90969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28983 1726882970.91031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28983 1726882970.91069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28983 1726882970.91125: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f3965d0> <<< 28983 1726882970.91183: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f30c230> <<< 28983 1726882970.91280: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f3063c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f300f50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 28983 1726882970.91312: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.91340: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.91356: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 28983 1726882970.91432: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 28983 1726882970.91453: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 28983 1726882970.91468: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.91530: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.91852: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 28983 1726882970.91904: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.91987: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.92006: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.92069: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 28983 1726882970.92072: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.92274: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.92463: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.92512: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.92581: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882970.92615: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 28983 1726882970.92640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 28983 1726882970.92671: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 28983 1726882970.92714: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39cbf0> <<< 28983 
1726882970.92717: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 28983 1726882970.92749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 28983 1726882970.92792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 28983 1726882970.92820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 28983 1726882970.92850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e8fbc80> <<< 28983 1726882970.92890: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.92904: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e8fbf50> <<< 28983 1726882970.92983: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f310d10> <<< 28983 1726882970.92991: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f310110> <<< 28983 1726882970.93015: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39ea20> <<< 28983 1726882970.93029: stdout chunk (state=3): >>>import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f464f39e240> <<< 28983 1726882970.93056: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 28983 1726882970.93114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 28983 1726882970.93138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 28983 1726882970.93181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 28983 1726882970.93191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 28983 1726882970.93260: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e913080> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e912930> <<< 28983 1726882970.93264: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.93268: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e912b10> <<< 28983 1726882970.93300: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e911d90> <<< 28983 1726882970.93306: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 28983 1726882970.93443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 28983 1726882970.93446: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e913200> <<< 28983 1726882970.93457: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 28983 1726882970.93482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 28983 1726882970.93511: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e971d00> <<< 28983 1726882970.93556: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e913ce0> <<< 28983 1726882970.93620: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39c380> import 'ansible.module_utils.facts.timeout' # <<< 28983 1726882970.93627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 28983 1726882970.93646: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 28983 1726882970.93661: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.93710: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 28983 1726882970.93775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 28983 1726882970.93798: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.93842: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.93928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 28983 1726882970.93931: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 28983 1726882970.93967: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.93970: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 28983 1726882970.94064: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 28983 1726882970.94138: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94168: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 28983 1726882970.94230: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94284: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94353: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94404: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.94482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 28983 1726882970.94497: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95054: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95571: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 28983 1726882970.95581: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95631: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95694: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95727: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95775: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 28983 1726882970.95810: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95822: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95847: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 28983 1726882970.95862: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95910: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.95971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 28983 1726882970.95993: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96019: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96057: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 28983 1726882970.96060: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96086: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 28983 1726882970.96220: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 28983 
1726882970.96360: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e972f00> <<< 28983 1726882970.96373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 28983 1726882970.96395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 28983 1726882970.96537: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e9724b0> <<< 28983 1726882970.96550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 28983 1726882970.96610: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 28983 1726882970.96714: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96795: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 28983 1726882970.96907: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.96972: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.97056: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 28983 1726882970.97105: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.97160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 28983 1726882970.97205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 28983 1726882970.97282: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.97347: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e99dee0> <<< 28983 1726882970.97555: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e98c230> <<< 28983 1726882970.97568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 28983 1726882970.97626: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.97693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 28983 1726882970.97789: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.97876: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.98049: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.98277: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 28983 1726882970.98315: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.98386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 28983 1726882970.98413: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882970.98497: stdout chunk (state=3): >>># extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e785d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e785940> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 28983 1726882970.98517: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 28983 1726882970.98545: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.98598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 28983 1726882970.98602: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.98802: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.98943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 28983 1726882970.99058: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.99171: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.99227: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.99495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 28983 1726882970.99499: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 28983 1726882970.99502: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882970.99506: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.99660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 28983 1726882970.99690: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882970.99804: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 28983 1726882970.99952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 28983 1726882970.99956: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.00024: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.00027: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.00691: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.01284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 28983 1726882971.01303: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.01422: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.01770: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 28983 1726882971.01972: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.02116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 28983 1726882971.02149: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882971.02169: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 28983 1726882971.02212: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.02266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 28983 1726882971.02522: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882971.02725: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.02965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 
'ansible.module_utils.facts.network.aix' # <<< 28983 1726882971.02988: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03011: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 28983 1726882971.03087: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03122: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 28983 1726882971.03211: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 28983 1726882971.03315: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03342: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03432: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 28983 1726882971.03494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 28983 1726882971.03507: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03563: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03628: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 28983 1726882971.03748: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.03943: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 28983 1726882971.04257: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04315: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04404: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.iscsi' # <<< 28983 1726882971.04429: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882971.04524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 28983 1726882971.04565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 28983 1726882971.04568: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04597: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04754: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 28983 1726882971.04758: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04760: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04830: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 28983 1726882971.04863: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 28983 1726882971.04911: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04923: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.04976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 28983 1726882971.05014: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882971.05065: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05084: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05132: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05212: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05311: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 
'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 28983 1726882971.05351: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05390: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 28983 1726882971.05523: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05663: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 28983 1726882971.05907: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05954: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.05997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 28983 1726882971.06050: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.06066: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.06155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 28983 1726882971.06162: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.06215: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.06317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 28983 1726882971.06333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 28983 1726882971.06426: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.06765: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 28983 1726882971.06769: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882971.07159: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 28983 1726882971.07185: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 28983 1726882971.07216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 28983 1726882971.07244: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e7b2870> <<< 28983 1726882971.07258: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7b01d0> <<< 28983 1726882971.07305: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7b2a80> <<< 28983 1726882971.25106: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 28983 1726882971.25148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7f8e60> <<< 28983 1726882971.25177: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 28983 1726882971.25202: 
stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7f9be0> <<< 28983 1726882971.25258: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882971.25328: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 28983 1726882971.25581: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e8482f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e849430> <<< 28983 1726882971.25616: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 28983 1726882971.45953: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-46-139", "ansible_nodename": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2c52174af731fc996c81a6a9338a65", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMAGmz4coceQATe4wVCPswNHKDq397sHN0wP3lxcDx5YXYj3mfO2mlh1Dpa7TCdRgLKwtozxXO6KafFlS3d0x9UWzSyKR0sSc77mhua/o3Y8EThq+wmVYqwwMQp1Vh8aBTvONV8N1UHqLp3aOdJIjHMGYdoUzUEF7xedcrV0fOV9AAAAFQDiC9S/VmOYdv/C8sXiFstIvsP/FQAAAIEApapvkLljxqN9GCi5UqXohiznCnndWFY9Vt/4wN+GtUjkuNJBqYHErEZCKfujpgVR94wM4sP3DbiJkL+OurGNHPJn7qrXDGQNIKExN7q3EzJI6yKBYdq1pnuhK1fBE/B8I/GQAEoqP3PMoutNlf85wWVgmt1DBc+D9D87BEGZzFoAAACBAIyk6Zb39dUz0T2fpmnSTF7AJHxsuBXwGZH1/5c5tWS0QGhwu5nzEoJUkQLhk+JqFJVRjNKoZ8wzH8N32ZrE15HfLF6/uIlfBorDH5AhDSnVumVmGZtYAerr8Cch5xqDXZSHTUhi7nBmdY/IKTgk7lCs0q4c7ja/wOueEHXkfWdF", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDjr2n9pN3Skd0p3fTYngf1LkuFARyHo/RDL4n6yhKcuFMhqe9MjqKkyMOvWaeNvTMkAiQ7TNEROz2mBqoPSot74UYaDR1nw9xQRq7skd9l/L6FlWrbg6EBCcQZUgkgjucBgmk3+INE0QVUdywVyW2IdmqayH/fLojViFulCLWlWn9cjFclC/t+sfMfoY6DrRIoi1GlfdlEfEHT9zGqC5syJrp6Yb3b9Ho/CYNAXya6aAHzMTLkx0/kU4czCptGZ9ew7HWLOtMv8iahxGrAp1VW4jj76+SZ5OisJ9N7+g8GPnaNAvsDNldGNQJWME6YNcEbxHblmHAEU0lq9EydM2W5iQHUnSezOSqQBljsiUACwwxZSphsqFYQnsHv4Vl/NlVTAJOApkU0VehWPUtOQNiqG+W/VGFMqqBksxF5tVDTO+qkvF5bm8JT2RSHAIbRpPPOYA8fg1PEPu1ONXD99Jn3urd0Y0kvUfp2NzPk1JFbxcGh+uDNHR2t5bVuOyRvmy0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKj0SRB0MBwzo3XKoUBfi6MCOa8n8Z6sjosikvEKYLTWy/hzFaSt2hhtv0qoPi/CAERuCNgGQ5pZPiqBpnr9C8A=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHVmW9cqR4t5U02ebXgqIiDjJ0aeuxmuwOiTXv538jBQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::2f3a:b84:7c06:1e06", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_check<<< 28983 1726882971.46016: stdout chunk (state=3): >>>summing": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.46.139"], "ansible_all_ipv6_addresses": ["fe80::2f3a:b84:7c06:1e06"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.46.139", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::2f3a:b84:7c06:1e06"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2841, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 876, "free": 2841}, "nocache": {"free": 3461, "used": 256}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_uuid": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 934, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, 
"size_total": 264124022784, "size_available": 251199463424, "block_size": 4096, "block_total": 64483404, "block_available": 61327994, "block_used": 3155410, "inode_total": 16384000, "inode_available": 16303512, "inode_used": 80488, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "42", "second": "51", "epoch": "1726882971", "epoch_int": "1726882971", "date": "2024-09-20", "time": "21:42:51", "iso8601_micro": "2024-09-21T01:42:51.455644Z", "iso8601": "2024-09-21T01:42:51Z", "iso8601_basic": "20240920T214251455644", "iso8601_basic_short": "20240920T214251", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 1.00146484375, "5m": 0.8857421875, "15m": 0.48779296875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 47942 10.31.46.139 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 47942 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28983 1726882971.46595: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path <<< 28983 1726882971.46643: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] 
removing posixpath # cleanup[2] removing os.path <<< 28983 1726882971.46724: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # 
cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner <<< 28983 1726882971.46775: stdout chunk (state=3): >>># cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # 
cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors <<< 28983 1726882971.46838: stdout chunk (state=3): >>># destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # 
cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing <<< 28983 1726882971.46907: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # 
cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing 
ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # 
cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor <<< 28983 1726882971.46964: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy 
ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd <<< 28983 1726882971.46992: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 28983 1726882971.47354: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28983 1726882971.47399: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 28983 1726882971.47511: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 28983 1726882971.47515: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 28983 1726882971.47589: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 28983 1726882971.47596: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 28983 1726882971.47680: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 28983 1726882971.47728: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy 
multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 28983 1726882971.47796: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 28983 1726882971.47867: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct <<< 28983 1726882971.47880: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 28983 1726882971.47962: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 28983 1726882971.48043: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # 
cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 28983 1726882971.48161: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 28983 1726882971.48165: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 28983 1726882971.48181: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy 
systemd._journal # destroy _datetime <<< 28983 1726882971.48338: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 28983 1726882971.48394: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 28983 1726882971.48421: stdout chunk (state=3): >>># destroy tokenize <<< 28983 1726882971.48477: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 28983 1726882971.48517: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28983 1726882971.48521: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28983 1726882971.48652: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28983 1726882971.48696: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 28983 1726882971.48739: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 28983 1726882971.48756: stdout chunk (state=3): >>># destroy _thread 
# clear sys.audit hooks <<< 28983 1726882971.49302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882971.49305: stdout chunk (state=3): >>><<< 28983 1726882971.49308: stderr chunk (state=3): >>><<< 28983 1726882971.49653: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464fa0c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f9dbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464fa0eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth 
file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f821160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f821fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85fdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f464f85ffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f897800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f897e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f877aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f875190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85cf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f464f8bb710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ba330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f876060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8b8a40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ec6e0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85c200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f8ecb90> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8eca40> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f8ece00> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f85ad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ed4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ed190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ee3c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f9085c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f909d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f90abd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f90b230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f90a120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f90bc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f90b3b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8ee3f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f61bb30> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py 
# code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f644620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f644380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f644590> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f6447d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f619cd0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f645e50> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f644b00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f8eeae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py 
# code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6761e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f68e390> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6cb110> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6f18b0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f6cb230> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f464f68f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5101d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f68d3d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f646d80> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f464f5103b0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ot0l9ldf/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f575eb0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f54cda0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f513ec0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f54fd10> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f5a57f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a5580> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a4ec0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a5970> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5768d0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 
'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f5a6570> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f5a67b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f5a6cf0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f40ca70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f40e690> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f40f020> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f40ffb0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f412c30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f412f60> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f410ef0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f416a80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f415550> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f4152b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' 
import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f417b60>
import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f411370>
# extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'
# extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'
import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f45ac90>
# /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py
# code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45ae70>
# /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py
# code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py
# code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'
# extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'
# extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'
import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f45c890>
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45c650>
# /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py
# code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'
# extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'
# extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'
import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f45ede0>
import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45cf80>
# /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py
# code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py
# code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'
import '_string' #
import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f46a600>
import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45ef90>
# extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'
# extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'
import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46b8c0>
# extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'
# extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'
import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46b770>
# extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'
# extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'
import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46ba10>
import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f45af60>
# /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py
# code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py
# code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'
# extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'
# extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'
import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46ea80>
# extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'
# extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'
import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46fd70>
import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f46d220>
# extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'
# extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'
import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f46e0f0>
import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f46cda0>
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.compat' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common.text' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.six' #
import 'ansible.module_utils.six.moves' #
import 'ansible.module_utils.six.moves.collections_abc' #
import 'ansible.module_utils.common.text.converters' #
# /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py
# code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'
# extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'
# extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'
import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f2f9550>
# /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py
# code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'
import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2f8e60>
import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f575e20>
import 'ansible.module_utils.compat.selinux' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils._text' #
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py
# code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc'
import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2f9490>
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common.collections' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common.warnings' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.errors' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.parsing' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.parsing.convert_bool' #
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py
# code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'
import '_ast' #
import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2fb410>
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common.text.formatters' #
import 'ansible.module_utils.common.validation' #
import 'ansible.module_utils.common.parameters' #
import 'ansible.module_utils.common.arg_spec' #
# /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py
# code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'
# extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'
# extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'
import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f301b20>
# extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'
# extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'
import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f302420>
import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f2fa420>
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common.locale' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py
# code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'
# extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'
# extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464f3010a0>
import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f302600>
import 'ansible.module_utils.common.file' #
import 'ansible.module_utils.common.process' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py
# code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'
# /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py
# code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py
# code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py
# code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'
import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f3965d0>
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f30c230>
import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f3063c0>
import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f300f50>
# destroy ansible.module_utils.distro
import 'ansible.module_utils.distro' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.common._utils' #
import 'ansible.module_utils.common.sys_info' #
import 'ansible.module_utils.basic' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.modules' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.namespace' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.compat.typing' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc'
# /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc'
# /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc'
import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39cbf0>
# /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py
# code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py
# code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc'
import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e8fbc80>
# extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'
# extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'
import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e8fbf50>
import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f310d10>
import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f310110>
import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39ea20>
import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39e240>
# /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py
# code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py
# code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'
# extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'
# extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'
import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e913080>
import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e912930>
# extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'
# extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'
import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e912b10>
import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e911d90>
# /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc'
import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e913200>
# /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc'
# extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'
# extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'
import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e971d00>
import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e913ce0>
import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464f39c380>
import 'ansible.module_utils.facts.timeout' #
import 'ansible.module_utils.facts.collector' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.other' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.other.facter' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.other.ohai' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.apparmor' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.caps' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.chroot' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.utils' #
import 'ansible.module_utils.facts.system.cmdline' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.distribution' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.compat.datetime' #
import 'ansible.module_utils.facts.system.date_time' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.env' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.dns' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.fips' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.loadavg' #
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py
# code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc'
import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e972f00>
# /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py
# code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc'
import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e9724b0>
import 'ansible.module_utils.facts.system.local' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.lsb' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.pkg_mgr' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.platform' #
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py
# code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc'
# extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'
# extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'
import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e99dee0>
import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e98c230>
import 'ansible.module_utils.facts.system.python' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.selinux' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.compat.version' #
import 'ansible.module_utils.facts.system.service_mgr' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.system.ssh_pub_keys' #
# zipimport: zlib available
# zipimport: zlib available
# /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py
# code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc'
# extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'
# extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'
import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e785d60>
import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e785940>
import 'ansible.module_utils.facts.system.user' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.base' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.aix' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.sysctl' #
import 'ansible.module_utils.facts.hardware.darwin' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.freebsd' #
import 'ansible.module_utils.facts.hardware.dragonfly' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.hpux' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.linux' #
import 'ansible.module_utils.facts.hardware.hurd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.netbsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.openbsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.hardware.sunos' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.base' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.generic_bsd' #
import 'ansible.module_utils.facts.network.aix' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.darwin' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.dragonfly' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.fc_wwn' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.freebsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.hpux' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.hurd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.linux' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.iscsi' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.nvme' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.netbsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.openbsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.network.sunos' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.base' #
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.sysctl' #
import 'ansible.module_utils.facts.virtual.freebsd' #
import 'ansible.module_utils.facts.virtual.dragonfly' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.hpux' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.linux' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.netbsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.openbsd' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.virtual.sunos' #
import 'ansible.module_utils.facts.default_collectors' #
# zipimport: zlib available
# zipimport: zlib available
import 'ansible.module_utils.facts.ansible_collector' #
import 'ansible.module_utils.facts.compat' #
import 'ansible.module_utils.facts' #
# zipimport: zlib available
# /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py
# code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc'
# /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py
# code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc'
# extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'
# extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'
import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f464e7b2870>
import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7b01d0>
import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7b2a80>
# /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc'
import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7f8e60>
# /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc'
import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e7f9be0>
# /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py
# code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc'
# /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py
# code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc'
import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e8482f0>
import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f464e849430>
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
PyThreadState_Clear: warning: thread still has a frame
{"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat",
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-46-139", "ansible_nodename": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2c52174af731fc996c81a6a9338a65", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMAGmz4coceQATe4wVCPswNHKDq397sHN0wP3lxcDx5YXYj3mfO2mlh1Dpa7TCdRgLKwtozxXO6KafFlS3d0x9UWzSyKR0sSc77mhua/o3Y8EThq+wmVYqwwMQp1Vh8aBTvONV8N1UHqLp3aOdJIjHMGYdoUzUEF7xedcrV0fOV9AAAAFQDiC9S/VmOYdv/C8sXiFstIvsP/FQAAAIEApapvkLljxqN9GCi5UqXohiznCnndWFY9Vt/4wN+GtUjkuNJBqYHErEZCKfujpgVR94wM4sP3DbiJkL+OurGNHPJn7qrXDGQNIKExN7q3EzJI6yKBYdq1pnuhK1fBE/B8I/GQAEoqP3PMoutNlf85wWVgmt1DBc+D9D87BEGZzFoAAACBAIyk6Zb39dUz0T2fpmnSTF7AJHxsuBXwGZH1/5c5tWS0QGhwu5nzEoJUkQLhk+JqFJVRjNKoZ8wzH8N32ZrE15HfLF6/uIlfBorDH5AhDSnVumVmGZtYAerr8Cch5xqDXZSHTUhi7nBmdY/IKTgk7lCs0q4c7ja/wOueEHXkfWdF", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDjr2n9pN3Skd0p3fTYngf1LkuFARyHo/RDL4n6yhKcuFMhqe9MjqKkyMOvWaeNvTMkAiQ7TNEROz2mBqoPSot74UYaDR1nw9xQRq7skd9l/L6FlWrbg6EBCcQZUgkgjucBgmk3+INE0QVUdywVyW2IdmqayH/fLojViFulCLWlWn9cjFclC/t+sfMfoY6DrRIoi1GlfdlEfEHT9zGqC5syJrp6Yb3b9Ho/CYNAXya6aAHzMTLkx0/kU4czCptGZ9ew7HWLOtMv8iahxGrAp1VW4jj76+SZ5OisJ9N7+g8GPnaNAvsDNldGNQJWME6YNcEbxHblmHAEU0lq9EydM2W5iQHUnSezOSqQBljsiUACwwxZSphsqFYQnsHv4Vl/NlVTAJOApkU0VehWPUtOQNiqG+W/VGFMqqBksxF5tVDTO+qkvF5bm8JT2RSHAIbRpPPOYA8fg1PEPu1ONXD99Jn3urd0Y0kvUfp2NzPk1JFbxcGh+uDNHR2t5bVuOyRvmy0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKj0SRB0MBwzo3XKoUBfi6MCOa8n8Z6sjosikvEKYLTWy/hzFaSt2hhtv0qoPi/CAERuCNgGQ5pZPiqBpnr9C8A=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHVmW9cqR4t5U02ebXgqIiDjJ0aeuxmuwOiTXv538jBQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::2f3a:b84:7c06:1e06", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.46.139"], "ansible_all_ipv6_addresses": ["fe80::2f3a:b84:7c06:1e06"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.46.139", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::2f3a:b84:7c06:1e06"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2841, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 876, "free": 2841}, "nocache": {"free": 3461, "used": 256}, "swap": {"total": 3716, "free": 3716, "used": 0, 
"cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_uuid": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 934, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251199463424, "block_size": 4096, "block_total": 64483404, "block_available": 61327994, "block_used": 3155410, "inode_total": 16384000, "inode_available": 16303512, "inode_used": 80488, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "42", "second": "51", "epoch": "1726882971", "epoch_int": "1726882971", "date": "2024-09-20", "time": "21:42:51", "iso8601_micro": "2024-09-21T01:42:51.455644Z", "iso8601": "2024-09-21T01:42:51Z", "iso8601_basic": "20240920T214251455644", "iso8601_basic_short": "20240920T214251", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 1.00146484375, "5m": 0.8857421875, "15m": 0.48779296875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 47942 10.31.46.139 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 47942 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc 
# cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # 
cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # 
destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
[WARNING]: Module invocation had junk after the JSON data: (Python interpreter shutdown trace — several hundred "# clear", "# cleanup[2]/[3] removing", "# destroy", and "# wiping" module-teardown messages omitted)
[WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
28983 1726882971.51468: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882971.51471: _low_level_execute_command(): starting 28983 1726882971.51474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882970.1486735-29001-50054265961443/ > /dev/null 2>&1 && sleep 0' 28983 1726882971.52040: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882971.52044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882971.52047: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.52155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.52159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882971.52163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882971.52269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882971.54241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882971.54326: stderr chunk (state=3): >>><<< 28983 1726882971.54344: stdout chunk (state=3): >>><<< 28983 1726882971.54540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882971.54544: handler run complete 28983 1726882971.54581: variable 
'ansible_facts' from source: unknown 28983 1726882971.54726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.55250: variable 'ansible_facts' from source: unknown 28983 1726882971.55385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.55680: attempt loop complete, returning result 28983 1726882971.55692: _execute() done 28983 1726882971.55701: dumping result to json 28983 1726882971.55756: done dumping result, returning 28983 1726882971.55771: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affe814-3a2d-b16d-c0a7-00000000001b] 28983 1726882971.55781: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000001b 28983 1726882971.57070: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000001b 28983 1726882971.57080: WORKER PROCESS EXITING ok: [managed_node2] 28983 1726882971.57292: no more pending results, returning what we have 28983 1726882971.57295: results queue empty 28983 1726882971.57296: checking for any_errors_fatal 28983 1726882971.57298: done checking for any_errors_fatal 28983 1726882971.57299: checking for max_fail_percentage 28983 1726882971.57301: done checking for max_fail_percentage 28983 1726882971.57302: checking to see if all hosts have failed and the running result is not ok 28983 1726882971.57303: done checking to see if all hosts have failed 28983 1726882971.57304: getting the remaining hosts for this loop 28983 1726882971.57306: done getting the remaining hosts for this loop 28983 1726882971.57310: getting the next task for host managed_node2 28983 1726882971.57317: done getting next task for host managed_node2 28983 1726882971.57320: ^ task is: TASK: meta (flush_handlers) 28983 1726882971.57322: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882971.57327: getting variables 28983 1726882971.57328: in VariableManager get_vars() 28983 1726882971.57358: Calling all_inventory to load vars for managed_node2 28983 1726882971.57362: Calling groups_inventory to load vars for managed_node2 28983 1726882971.57366: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882971.57376: Calling all_plugins_play to load vars for managed_node2 28983 1726882971.57379: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882971.57389: Calling groups_plugins_play to load vars for managed_node2 28983 1726882971.57700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.58041: done with get_vars() 28983 1726882971.58055: done getting variables 28983 1726882971.58150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 28983 1726882971.58224: in VariableManager get_vars() 28983 1726882971.58238: Calling all_inventory to load vars for managed_node2 28983 1726882971.58241: Calling groups_inventory to load vars for managed_node2 28983 1726882971.58244: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882971.58250: Calling all_plugins_play to load vars for managed_node2 28983 1726882971.58258: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882971.58263: Calling groups_plugins_play to load vars for managed_node2 28983 1726882971.58607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.58914: done with get_vars() 28983 1726882971.58930: done queuing things up, now waiting for results queue to drain 28983 1726882971.58932: results queue empty 28983 1726882971.58935: checking for any_errors_fatal 28983 1726882971.58938: 
done checking for any_errors_fatal 28983 1726882971.58939: checking for max_fail_percentage 28983 1726882971.58940: done checking for max_fail_percentage 28983 1726882971.58946: checking to see if all hosts have failed and the running result is not ok 28983 1726882971.58947: done checking to see if all hosts have failed 28983 1726882971.58948: getting the remaining hosts for this loop 28983 1726882971.58949: done getting the remaining hosts for this loop 28983 1726882971.58952: getting the next task for host managed_node2 28983 1726882971.58957: done getting next task for host managed_node2 28983 1726882971.58960: ^ task is: TASK: Include the task 'el_repo_setup.yml' 28983 1726882971.58961: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882971.58964: getting variables 28983 1726882971.58965: in VariableManager get_vars() 28983 1726882971.58975: Calling all_inventory to load vars for managed_node2 28983 1726882971.58977: Calling groups_inventory to load vars for managed_node2 28983 1726882971.58980: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882971.58985: Calling all_plugins_play to load vars for managed_node2 28983 1726882971.58988: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882971.58991: Calling groups_plugins_play to load vars for managed_node2 28983 1726882971.59230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.59546: done with get_vars() 28983 1726882971.59557: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Friday 20 September 2024 21:42:51 -0400 (0:00:01.528) 0:00:01.594 ****** 28983 1726882971.59654: entering _queue_task() for managed_node2/include_tasks 28983 1726882971.59657: Creating lock for include_tasks 28983 1726882971.60120: worker is 1 (out of 1 available) 28983 1726882971.60136: exiting _queue_task() for managed_node2/include_tasks 28983 1726882971.60149: done queuing things up, now waiting for results queue to drain 28983 1726882971.60151: waiting for pending results... 28983 1726882971.60313: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 28983 1726882971.60421: in run() - task 0affe814-3a2d-b16d-c0a7-000000000006 28983 1726882971.60449: variable 'ansible_search_path' from source: unknown 28983 1726882971.60498: calling self._execute() 28983 1726882971.60586: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882971.60603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882971.60619: variable 'omit' from source: magic vars 28983 1726882971.60749: _execute() done 28983 1726882971.60759: dumping result to json 28983 1726882971.60773: done dumping result, returning 28983 1726882971.60784: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affe814-3a2d-b16d-c0a7-000000000006] 28983 1726882971.60795: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000006 28983 1726882971.60967: no more pending results, returning what we have 28983 1726882971.60973: in VariableManager get_vars() 28983 1726882971.61170: Calling all_inventory to load vars for managed_node2 28983 1726882971.61174: Calling groups_inventory to load vars for managed_node2 28983 1726882971.61179: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882971.61192: Calling all_plugins_play to load vars for managed_node2 28983 
1726882971.61196: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882971.61200: Calling groups_plugins_play to load vars for managed_node2 28983 1726882971.61532: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000006 28983 1726882971.61537: WORKER PROCESS EXITING 28983 1726882971.61568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.61917: done with get_vars() 28983 1726882971.61927: variable 'ansible_search_path' from source: unknown 28983 1726882971.61944: we have included files to process 28983 1726882971.61945: generating all_blocks data 28983 1726882971.61947: done generating all_blocks data 28983 1726882971.61948: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28983 1726882971.61950: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28983 1726882971.61953: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 28983 1726882971.62869: in VariableManager get_vars() 28983 1726882971.62889: done with get_vars() 28983 1726882971.62905: done processing included file 28983 1726882971.62906: iterating over new_blocks loaded from include file 28983 1726882971.62908: in VariableManager get_vars() 28983 1726882971.62920: done with get_vars() 28983 1726882971.62921: filtering new block on tags 28983 1726882971.62940: done filtering new block on tags 28983 1726882971.62943: in VariableManager get_vars() 28983 1726882971.62956: done with get_vars() 28983 1726882971.62957: filtering new block on tags 28983 1726882971.62981: done filtering new block on tags 28983 1726882971.62984: in VariableManager get_vars() 28983 1726882971.62997: done with get_vars() 28983 1726882971.62999: filtering new block on tags 28983 1726882971.63015: 
done filtering new block on tags 28983 1726882971.63018: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 28983 1726882971.63025: extending task lists for all hosts with included blocks 28983 1726882971.63097: done extending task lists 28983 1726882971.63098: done processing included files 28983 1726882971.63099: results queue empty 28983 1726882971.63100: checking for any_errors_fatal 28983 1726882971.63102: done checking for any_errors_fatal 28983 1726882971.63103: checking for max_fail_percentage 28983 1726882971.63104: done checking for max_fail_percentage 28983 1726882971.63105: checking to see if all hosts have failed and the running result is not ok 28983 1726882971.63106: done checking to see if all hosts have failed 28983 1726882971.63107: getting the remaining hosts for this loop 28983 1726882971.63108: done getting the remaining hosts for this loop 28983 1726882971.63111: getting the next task for host managed_node2 28983 1726882971.63116: done getting next task for host managed_node2 28983 1726882971.63119: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 28983 1726882971.63121: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882971.63124: getting variables 28983 1726882971.63125: in VariableManager get_vars() 28983 1726882971.63136: Calling all_inventory to load vars for managed_node2 28983 1726882971.63140: Calling groups_inventory to load vars for managed_node2 28983 1726882971.63144: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882971.63150: Calling all_plugins_play to load vars for managed_node2 28983 1726882971.63153: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882971.63157: Calling groups_plugins_play to load vars for managed_node2 28983 1726882971.63410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882971.63740: done with get_vars() 28983 1726882971.63753: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:42:51 -0400 (0:00:00.041) 0:00:01.636 ****** 28983 1726882971.63829: entering _queue_task() for managed_node2/setup 28983 1726882971.64562: worker is 1 (out of 1 available) 28983 1726882971.64576: exiting _queue_task() for managed_node2/setup 28983 1726882971.64591: done queuing things up, now waiting for results queue to drain 28983 1726882971.64593: waiting for pending results... 
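The repeated `Calling all_inventory ... Calling groups_plugins_play` lines above come from `VariableManager.get_vars()` consulting variable sources in a fixed precedence order. As a rough mental model (this toy sketch is not Ansible's actual merge logic, and the layer values are invented for illustration), later sources override earlier ones:

```python
# Toy sketch of variable-source layering, later layers winning on conflicts.
# This approximates the get_vars() call order seen in the log; it is NOT
# Ansible's implementation, and the example values are assumptions.

def combine_vars(layers):
    """Merge dict layers left to right; later layers win on key conflicts."""
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged

host_vars = combine_vars([
    {"ansible_host": "10.31.46.139"},  # inventory host vars (address seen in the log)
    {"ansible_timeout": 10},           # play/plugin-level vars (value assumed)
])
```

The same six-source sequence repeats before every task in this log because variables are re-resolved per task, not cached for the whole play.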
28983 1726882971.64849: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 28983 1726882971.64972: in run() - task 0affe814-3a2d-b16d-c0a7-00000000002c 28983 1726882971.64994: variable 'ansible_search_path' from source: unknown 28983 1726882971.65001: variable 'ansible_search_path' from source: unknown 28983 1726882971.65048: calling self._execute() 28983 1726882971.65137: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882971.65150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882971.65188: variable 'omit' from source: magic vars 28983 1726882971.66081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882971.69290: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882971.69382: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882971.69430: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882971.69670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882971.69716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882971.69822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882971.69915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882971.69919: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882971.69964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882971.69989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882971.70213: variable 'ansible_facts' from source: unknown 28983 1726882971.70316: variable 'network_test_required_facts' from source: task vars 28983 1726882971.70370: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 28983 1726882971.70385: variable 'omit' from source: magic vars 28983 1726882971.70439: variable 'omit' from source: magic vars 28983 1726882971.70571: variable 'omit' from source: magic vars 28983 1726882971.70575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882971.70724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882971.70751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882971.70813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882971.71182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882971.71186: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882971.71188: variable 'ansible_host' from source: host vars for 
'managed_node2' 28983 1726882971.71190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882971.71193: Set connection var ansible_connection to ssh 28983 1726882971.71195: Set connection var ansible_shell_executable to /bin/sh 28983 1726882971.71197: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882971.71199: Set connection var ansible_timeout to 10 28983 1726882971.71202: Set connection var ansible_pipelining to False 28983 1726882971.71204: Set connection var ansible_shell_type to sh 28983 1726882971.71219: variable 'ansible_shell_executable' from source: unknown 28983 1726882971.71227: variable 'ansible_connection' from source: unknown 28983 1726882971.71238: variable 'ansible_module_compression' from source: unknown 28983 1726882971.71247: variable 'ansible_shell_type' from source: unknown 28983 1726882971.71254: variable 'ansible_shell_executable' from source: unknown 28983 1726882971.71262: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882971.71274: variable 'ansible_pipelining' from source: unknown 28983 1726882971.71286: variable 'ansible_timeout' from source: unknown 28983 1726882971.71296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882971.71489: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882971.71506: variable 'omit' from source: magic vars 28983 1726882971.71516: starting attempt loop 28983 1726882971.71523: running the handler 28983 1726882971.71549: _low_level_execute_command(): starting 28983 1726882971.71561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882971.72292: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 
1726882971.72356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.72448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882971.72474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882971.72630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882971.74427: stdout chunk (state=3): >>>/root <<< 28983 1726882971.74631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882971.74636: stdout chunk (state=3): >>><<< 28983 1726882971.74638: stderr chunk (state=3): >>><<< 28983 1726882971.74791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882971.74802: _low_level_execute_command(): starting 28983 1726882971.74806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265 `" && echo ansible-tmp-1726882971.7467287-29070-197833217790265="` echo /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265 `" ) && sleep 0' 28983 1726882971.75889: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882971.75892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882971.75894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882971.75897: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882971.75900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.76262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882971.78333: stdout chunk (state=3): >>>ansible-tmp-1726882971.7467287-29070-197833217790265=/root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265 <<< 28983 1726882971.78438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882971.78744: stderr chunk (state=3): >>><<< 28983 1726882971.78748: stdout chunk (state=3): >>><<< 28983 1726882971.78750: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882971.7467287-29070-197833217790265=/root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882971.78753: variable 'ansible_module_compression' from source: unknown 28983 1726882971.78755: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28983 1726882971.78868: variable 'ansible_facts' from source: unknown 28983 1726882971.79303: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py 28983 1726882971.79743: Sending initial data 28983 1726882971.79754: Sent initial data (154 bytes) 28983 1726882971.80952: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.81158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
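The remote path created above, `ansible-tmp-1726882971.7467287-29070-197833217790265`, follows a recognizable pattern: `ansible-tmp-<epoch timestamp>-<controller pid>-<random int>`. A minimal sketch of generating a comparable name (the pattern is inferred from the log, and this is not Ansible's actual code):

```python
# Sketch of the remote temp-dir naming scheme inferred from the log line
# ansible-tmp-1726882971.7467287-29070-197833217790265
# (epoch time, then the controller's pid, then a random integer).
import os
import random
import time

def make_remote_tmp_name():
    # Assumed pattern; the random component's range is a guess for illustration.
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2 ** 48))

tmp_name = make_remote_tmp_name()
```

The unique suffix lets concurrent tasks on the same host create their payload directories under `~/.ansible/tmp` without colliding.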
28983 1726882971.81179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882971.81202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882971.81309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882971.83038: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726882971.83055: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882971.83104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882971.83247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpr9qfd_3q /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py <<< 28983 1726882971.83257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py" <<< 28983 1726882971.83291: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpr9qfd_3q" to remote "/root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py" <<< 28983 1726882971.83358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py" <<< 28983 1726882971.87962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882971.88160: stderr chunk (state=3): >>><<< 28983 1726882971.88166: stdout chunk (state=3): >>><<< 28983 1726882971.88169: done transferring module to remote 28983 1726882971.88173: _low_level_execute_command(): starting 28983 1726882971.88176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/ /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py && sleep 0' 28983 1726882971.89361: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882971.89368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882971.89371: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.89373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882971.89376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.89573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882971.89769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882971.91721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882971.91789: stderr chunk (state=3): >>><<< 28983 1726882971.91799: stdout chunk (state=3): >>><<< 28983 1726882971.91823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882971.91833: _low_level_execute_command(): starting 28983 1726882971.91846: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/AnsiballZ_setup.py && sleep 0' 28983 1726882971.93043: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882971.93046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882971.93049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.93051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882971.93053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882971.93264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882971.93451: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882971.95752: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28983 1726882971.95776: stdout chunk (state=3): >>>import _imp # builtin <<< 28983 1726882971.95817: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28983 1726882971.95951: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 28983 1726882971.95977: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28983 1726882971.96001: stdout chunk (state=3): >>>import 'time' # <<< 28983 1726882971.96015: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 28983 1726882971.96255: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf31b4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3183b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf31b6ab0> <<< 28983 1726882971.96272: stdout chunk (state=3): >>>import '_signal' # <<< 28983 1726882971.96300: stdout chunk (state=3): >>>import '_abc' # <<< 
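The stream of `import '...'` and `# code object from ...` lines above is not output from the setup module itself: it is CPython's import tracing, enabled by the `PYTHONVERBOSE=1` prefix on the `AnsiballZ_setup.py` command shown earlier. The same tracing can be reproduced locally:

```python
# Reproduce the verbose import trace seen in the log: PYTHONVERBOSE=1 makes
# CPython write one trace line per imported module to stderr.
import os
import subprocess
import sys

env = dict(os.environ, PYTHONVERBOSE="1")
proc = subprocess.run(
    [sys.executable, "-c", "import base64"],
    env=env,
    capture_output=True,
    text=True,
)
# proc.stderr now contains lines like:
#   import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x...>
```

This is why the task's stdout chunks are dominated by interpreter noise here; with tracing disabled, only the module's JSON result would come back.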
28983 1726882971.96304: stdout chunk (state=3): >>>import 'abc' # <<< 28983 1726882971.96321: stdout chunk (state=3): >>>import 'io' # <<< 28983 1726882971.96359: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 28983 1726882971.96452: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28983 1726882971.96486: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 28983 1726882971.96552: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 28983 1726882971.96570: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 28983 1726882971.96586: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 28983 1726882971.96611: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 28983 1726882971.96851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2f65160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2f65fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28983 1726882971.97195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28983 1726882971.97210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 28983 1726882971.97227: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 28983 1726882971.97241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882971.97263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28983 1726882971.97357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28983 1726882971.97370: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa3e90> <<< 28983 1726882971.97390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28983 1726882971.97663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa3f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' 
import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fdb8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 28983 1726882971.97666: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fdbf50> <<< 28983 1726882971.97681: stdout chunk (state=3): >>>import '_collections' # <<< 28983 1726882971.97741: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fbbb60> <<< 28983 1726882971.97744: stdout chunk (state=3): >>>import '_functools' # <<< 28983 1726882971.97774: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fb9280> <<< 28983 1726882971.97947: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa1040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 28983 1726882971.97961: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28983 1726882971.97989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28983 1726882971.98004: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 28983 1726882971.98021: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28983 1726882971.98051: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fff800> <<< 28983 1726882971.98068: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ffe420> <<< 28983 1726882971.98260: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ffccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3030890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf3030d40> <<< 28983 1726882971.98265: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3030bf0> <<< 28983 1726882971.98291: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 28983 
1726882971.98309: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf3030fb0> <<< 28983 1726882971.98321: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2f9ede0> <<< 28983 1726882971.98352: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882971.98370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28983 1726882971.98407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28983 1726882971.98428: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3031670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3031340> import 'importlib.machinery' # <<< 28983 1726882971.98651: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3032510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 28983 
1726882971.98681: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3048740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf3049e20> <<< 28983 1726882971.98707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28983 1726882971.98710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28983 1726882971.98728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 28983 1726882971.98741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 28983 1726882971.98752: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf304acf0> <<< 28983 1726882971.98794: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf304b350> <<< 28983 1726882971.98807: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf304a270> <<< 28983 1726882971.98824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 28983 1726882971.98839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28983 
1726882971.98874: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882971.99051: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf304bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf304b500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3032570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 28983 1726882971.99076: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d53c80> <<< 28983 1726882971.99113: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 28983 1726882971.99143: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d7c740> import 'bisect' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7c4a0> <<< 28983 1726882971.99167: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882971.99243: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d7c770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d7c950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d51e20> <<< 28983 1726882971.99256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28983 1726882971.99361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28983 1726882971.99455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7df70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7cbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3032c60> <<< 28983 1726882971.99559: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 
28983 1726882971.99570: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 28983 1726882971.99598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28983 1726882971.99682: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dae300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 28983 1726882971.99700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882971.99716: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28983 1726882971.99796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28983 1726882971.99804: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dc6480> <<< 28983 1726882971.99847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28983 1726882971.99861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28983 1726882971.99918: stdout chunk (state=3): >>>import 'ntpath' # <<< 28983 1726882972.00160: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2e03230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28983 1726882972.00174: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2e259d0> <<< 28983 1726882972.00251: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2e03350> <<< 28983 1726882972.00291: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dc7110> <<< 28983 1726882972.00317: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 28983 1726882972.00385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c40350> <<< 28983 1726882972.00417: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dc54c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7eea0> <<< 28983 1726882972.00516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28983 1726882972.00540: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2cf2c40530> <<< 28983 1726882972.00717: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ll4go3kt/ansible_setup_payload.zip' <<< 28983 1726882972.00732: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.00877: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.00906: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28983 1726882972.00921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28983 1726882972.00964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28983 1726882972.01047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28983 1726882972.01079: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ca9fa0> <<< 28983 1726882972.01097: stdout chunk (state=3): >>>import '_typing' # <<< 28983 1726882972.01289: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c80e90> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c43f50> <<< 28983 1726882972.01317: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.01349: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 28983 1726882972.01377: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882972.01403: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 28983 1726882972.03078: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.04329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c83e30> <<< 28983 1726882972.04381: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.04415: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 28983 1726882972.04419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28983 1726882972.04421: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28983 1726882972.04528: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2cd9970> <<< 28983 1726882972.04531: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cd9700> <<< 28983 1726882972.04586: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cd9010> <<< 28983 1726882972.04590: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28983 1726882972.04665: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cd9460> import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2caac30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2cda720> <<< 28983 1726882972.04685: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2cda960> <<< 28983 1726882972.04700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28983 1726882972.04826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28983 1726882972.04830: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cdaea0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28983 1726882972.04856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28983 1726882972.04894: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b40b90> <<< 28983 1726882972.04926: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b427b0> 
<<< 28983 1726882972.04951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 28983 1726882972.05021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28983 1726882972.05069: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b430e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28983 1726882972.05157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b43f20> <<< 28983 1726882972.05161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28983 1726882972.05172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28983 1726882972.05222: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b46d80> <<< 28983 1726882972.05267: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b470b0> <<< 28983 1726882972.05311: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b45040> <<< 28983 1726882972.05382: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 28983 1726882972.05386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28983 1726882972.05480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 28983 1726882972.05498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b4ae10> import '_tokenize' # <<< 28983 1726882972.05547: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b498e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b49670> <<< 28983 1726882972.05597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28983 1726882972.05807: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b4bc80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b45550> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b8efc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b8f1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 28983 1726882972.05811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 28983 1726882972.05814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28983 1726882972.05877: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b90c20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b909e0> <<< 28983 1726882972.05887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28983 1726882972.06059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28983 1726882972.06063: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b931a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b912e0> <<< 28983 1726882972.06090: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28983 1726882972.06170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.06200: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 28983 1726882972.06300: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b9e960> <<< 28983 1726882972.06395: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b932f0> <<< 28983 1726882972.06476: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b9fbf0> <<< 28983 1726882972.06513: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b9f9e0> <<< 28983 1726882972.06650: 
stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b9fd40> <<< 28983 1726882972.06667: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b8f380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28983 1726882972.06695: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.06726: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2ba34d0> <<< 28983 1726882972.06930: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.06935: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2ba4770> <<< 28983 1726882972.06983: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ba1c40> <<< 28983 
1726882972.06990: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.06994: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2ba2fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ba1820> <<< 28983 1726882972.07008: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.07043: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 28983 1726882972.07158: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.07266: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.07293: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 28983 1726882972.07330: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 28983 1726882972.07339: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.07489: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.07628: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.08421: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.09159: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 28983 1726882972.09162: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # 
code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a2c6e0> <<< 28983 1726882972.09258: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28983 1726882972.09282: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2d490> <<< 28983 1726882972.09307: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ba0980> <<< 28983 1726882972.09348: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 28983 1726882972.09466: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.09484: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 28983 1726882972.09487: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.09589: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.09797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 28983 1726882972.09805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2d4f0> <<< 28983 1726882972.09817: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.10371: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.10925: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11013: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11111: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28983 1726882972.11122: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11169: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11199: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 28983 1726882972.11219: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11302: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11418: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28983 1726882972.11428: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11454: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 28983 1726882972.11472: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11514: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11563: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28983 1726882972.11571: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.11857: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.12155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28983 1726882972.12236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 28983 1726882972.12355: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2f950> <<< 28983 1726882972.12371: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.12430: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 28983 1726882972.12525: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 28983 1726882972.12572: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 28983 1726882972.12577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28983 1726882972.12659: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.12798: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a36000> <<< 28983 1726882972.12870: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a36900> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2ea20> <<< 28983 1726882972.12873: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.12923: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.12973: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28983 1726882972.12986: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13023: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13074: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 28983 1726882972.13136: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13211: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28983 1726882972.13255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.13351: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a35550> <<< 28983 1726882972.13399: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a36b10> <<< 28983 1726882972.13437: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 28983 1726882972.13441: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 28983 1726882972.13446: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13516: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13583: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13617: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.13660: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 28983 1726882972.13665: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.13683: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28983 1726882972.13709: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 28983 1726882972.13727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28983 1726882972.13796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28983 1726882972.13809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 28983 1726882972.13829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28983 1726882972.13892: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ac6c90> <<< 28983 1726882972.13943: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a439e0> <<< 28983 1726882972.14029: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a3ab40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a3a900> # destroy ansible.module_utils.distro <<< 28983 1726882972.14049: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 28983 1726882972.14056: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14077: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14110: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 28983 1726882972.14180: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 28983 1726882972.14188: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 28983 1726882972.14204: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 28983 1726882972.14223: stdout chunk (state=3): >>># zipimport: zlib available<<< 28983 1726882972.14229: stdout chunk (state=3): >>> <<< 28983 1726882972.14292: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14360: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14377: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14403: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14447: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14500: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14533: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 28983 1726882972.14589: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14674: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14751: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14781: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.14817: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 28983 1726882972.14827: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.15026: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.15239: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.15271: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.15341: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.15373: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 28983 1726882972.15404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 28983 1726882972.15440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 28983 1726882972.15451: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acda90> <<< 28983 1726882972.15491: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 28983 1726882972.15494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 28983 1726882972.15505: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 28983 1726882972.15568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 28983 1726882972.15595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 28983 1726882972.15638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f441a0> <<< 28983 1726882972.15658: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1f444d0> <<< 28983 1726882972.15798: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ab5250> <<< 28983 1726882972.15801: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ab4110> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acc200> <<< 28983 1726882972.15807: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acfa40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 28983 1726882972.15896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 28983 1726882972.15900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 28983 1726882972.15926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 28983 1726882972.15981: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1f47410> <<< 28983 1726882972.16021: stdout 
chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f46cc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1f46ea0> <<< 28983 1726882972.16025: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f46150> <<< 28983 1726882972.16051: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 28983 1726882972.16160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 28983 1726882972.16187: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f474d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 28983 1726882972.16216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 28983 1726882972.16257: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1fb1fa0> <<< 28983 1726882972.16289: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f47f80> <<< 28983 1726882972.16319: stdout chunk (state=3): 
>>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acfd40> import 'ansible.module_utils.facts.timeout' # <<< 28983 1726882972.16367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 28983 1726882972.16373: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 28983 1726882972.16456: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 28983 1726882972.16531: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16601: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16646: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 28983 1726882972.16669: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16681: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 28983 1726882972.16692: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16737: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16764: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 28983 1726882972.16772: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16828: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 28983 1726882972.16891: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16942: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.16977: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 28983 
1726882972.16996: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.17058: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.17120: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.17186: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.17251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 28983 1726882972.17269: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.17816: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18325: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 28983 1726882972.18382: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18446: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18475: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18543: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 28983 1726882972.18558: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18567: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 28983 1726882972.18610: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18674: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 28983 1726882972.18762: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18795: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # 
<<< 28983 1726882972.18829: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18862: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 28983 1726882972.18910: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.18993: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 28983 1726882972.19102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 28983 1726882972.19132: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1fb38c0> <<< 28983 1726882972.19150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 28983 1726882972.19183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 28983 1726882972.19317: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1fb2ba0> <<< 28983 1726882972.19320: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 28983 1726882972.19341: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19405: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 28983 1726882972.19500: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19589: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 28983 1726882972.19708: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19770: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19861: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 28983 1726882972.19877: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19908: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.19963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 28983 1726882972.20017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 28983 1726882972.20103: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.20165: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1fe6240> <<< 28983 1726882972.20373: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1fd2f00> import 'ansible.module_utils.facts.system.python' # <<< 28983 1726882972.20387: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.20449: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.20519: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 28983 1726882972.20523: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.20619: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.20710: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.20844: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21030: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 28983 1726882972.21079: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 28983 1726882972.21122: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21160: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 28983 1726882972.21289: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.21329: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1ffdb20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1ffda90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 28983 1726882972.21346: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21385: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21427: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 28983 1726882972.21449: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21620: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.21784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 28983 1726882972.21794: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 28983 1726882972.21906: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22018: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22066: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 28983 1726882972.22129: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22147: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22175: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22332: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 28983 1726882972.22511: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22647: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 28983 1726882972.22794: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22830: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.22866: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.23511: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 28983 1726882972.24116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 28983 1726882972.24231: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24362: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 28983 
1726882972.24365: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24470: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 28983 1726882972.24592: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24758: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 28983 1726882972.24946: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.24959: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 28983 1726882972.24982: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25023: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 28983 1726882972.25084: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25194: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25303: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25535: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25767: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 28983 1726882972.25774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 28983 1726882972.25782: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25828: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 28983 1726882972.25877: stdout chunk (state=3): >>># zipimport: zlib available<<< 28983 1726882972.25887: stdout chunk (state=3): >>> <<< 28983 
1726882972.25903: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.25922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 28983 1726882972.25947: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26017: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 28983 1726882972.26106: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26128: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 28983 1726882972.26224: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 28983 1726882972.26298: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26355: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26417: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 28983 1726882972.26422: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.26719: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27011: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 28983 1726882972.27020: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27077: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 28983 1726882972.27150: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27194: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27222: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.nvme' # <<< 28983 1726882972.27238: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27266: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 28983 1726882972.27310: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27352: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 28983 1726882972.27397: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27475: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27563: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 28983 1726882972.27574: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27589: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 28983 1726882972.27615: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27655: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 28983 1726882972.27726: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27748: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 28983 1726882972.27803: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27854: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.27928: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28007: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 28983 1726882972.28023: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 28983 1726882972.28028: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28082: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 28983 1726882972.28143: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28362: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 28983 1726882972.28596: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28637: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 28983 1726882972.28694: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28752: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 28983 1726882972.28805: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28892: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.28983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 28983 1726882972.28997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 28983 1726882972.29094: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.29219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 28983 1726882972.29293: stdout chunk (state=3): >>># zipimport: zlib available <<< 
28983 1726882972.30279: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 28983 1726882972.30288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 28983 1726882972.30311: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 28983 1726882972.30327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 28983 1726882972.30374: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1e2b5c0> <<< 28983 1726882972.30387: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1e2aae0> <<< 28983 1726882972.30445: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1e29340> <<< 28983 1726882972.30800: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fips": false, "ansible_distribution": 
"Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "42", "second": "52", "epoch": "1726882972", "epoch_int": "1726882972", "date": "2024-09-20", "time": "21:42:52", "iso8601_micro": "2024-09-21T01:42:52.301070Z", "iso8601": "2024-09-21T01:42:52Z", "iso8601_basic": "20240920T214252301070", "iso8601_basic_short": "20240920T214252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ans<<< 28983 1726882972.30823: stdout chunk (state=3): >>>ible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMAGmz4coceQATe4wVCPswNHKDq397sHN0wP3lxcDx5YXYj3mfO2mlh1Dpa7TCdRgLKwtozxXO6KafFlS3d0x9UWzSyKR0sSc77mhua/o3Y8EThq+wmVYqwwMQp1Vh8aBTvONV8N1UHqLp3aOdJIjHMGYdoUzUEF7xedcrV0fOV9AAAAFQDiC9S/VmOYdv/C8sXiFstIvsP/FQAAAIEApapvkLljxqN9GCi5UqXohiznCnndWFY9Vt/4wN+GtUjkuNJBqYHErEZCKfujpgVR94wM4sP3DbiJkL+OurGNHPJn7qrXDGQNIKExN7q3EzJI6yKBYdq1pnuhK1fBE/B8I/GQAEoqP3PMoutNlf85wWVgmt1DBc+D9D87BEGZzFoAAACBAIyk6Zb39dUz0T2fpmnSTF7AJHxsuBXwGZH1/5c5tWS0QGhwu5nzEoJUkQLhk+JqFJVRjNKoZ8wzH8N32ZrE15HfLF6/uIlfBorDH5AhDSnVumVmGZtYAerr8Cch5xqDXZSHTUhi7nBmdY/IKTgk7lCs0q4c7ja/wOueEHXkfWdF", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDjr2n9pN3Skd0p3fTYngf1LkuFARyHo/RDL4n6yhKcuFMhqe9MjqKkyMOvWaeNvTMkAiQ7TNEROz2mBqoPSot74UYaDR1nw9xQRq7skd9l/L6FlWrbg6EBCcQZUgkgjucBgmk3+INE0QVUdywVyW2IdmqayH/fLojViFulCLWlWn9cjFclC/t+sfMfoY6DrRIoi1GlfdlEfEHT9zGqC5syJrp6Yb3b9Ho/CYNAXya6aAHzMTLkx0/kU4czCptGZ9ew7HWLOtMv8iahxGrAp1VW4jj76+SZ5OisJ9N7+g8GPnaNAvsDNldGNQJWME6YNcEbxHblmHAEU0lq9EydM2W5iQHUnSezOSqQBljsiUACwwxZSphsqFYQnsHv4Vl/NlVTAJOApkU0VehWPUtOQNiqG+W/VGFMqqBksxF5tVDTO+qkvF5bm8JT2RSHAIbRpPPOYA8fg1PEPu1ONXD99Jn3urd0Y0kvUfp2NzPk1JFbxcGh+uDNHR2t5bVuOyRvmy0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKj0SRB0MBwzo3XKoUBfi6MCOa8n8Z6sjosikvEKYLTWy/hzFaSt2hhtv0qoPi/CAERuCNgGQ5pZPiqBpnr9C8A=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHVmW9cqR4t5U02ebXgqIiDjJ0aeuxmuwOiTXv538jBQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-46-139", "ansible_nodename": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2c52174af731fc996c81a6a9338a65", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 47942 10.31.46.139 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 47942 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28983 1726882972.31418: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 28983 1726882972.31427: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 28983 1726882972.31438: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] 
removing _thread <<< 28983 1726882972.31465: stdout chunk (state=3): >>># cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal <<< 28983 1726882972.31469: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 28983 1726882972.31495: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # 
cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 28983 1726882972.31507: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset <<< 28983 1726882972.31525: stdout chunk (state=3): >>># cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path <<< 28983 1726882972.31533: stdout chunk (state=3): >>># cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] 
removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors <<< 28983 1726882972.31566: stdout chunk (state=3): >>># cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string <<< 28983 1726882972.31575: stdout chunk (state=3): >>># cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six <<< 28983 1726882972.31607: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] 
removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 28983 1726882972.31611: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec <<< 28983 1726882972.31636: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 28983 1726882972.31648: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process <<< 28983 1726882972.31664: stdout chunk (state=3): >>># destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # 
cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq <<< 28983 1726882972.31684: stdout chunk (state=3): >>># cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system <<< 28983 1726882972.31701: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] 
removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local<<< 28983 1726882972.31712: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly <<< 28983 1726882972.31719: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd <<< 28983 1726882972.31755: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos 
# cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env<<< 28983 1726882972.31758: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python <<< 28983 1726882972.31780: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy 
ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd <<< 28983 1726882972.31793: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 28983 1726882972.32365: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # 
destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 28983 1726882972.32432: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 28983 1726882972.32437: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 28983 1726882972.32483: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 28983 1726882972.32566: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 28983 1726882972.32598: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 28983 1726882972.32820: stdout chunk (state=3): >>># destroy _ssl <<< 28983 1726882972.32827: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy 
json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 28983 1726882972.32929: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 28983 1726882972.32947: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # 
cleanup[3] wiping types <<< 28983 1726882972.32987: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 28983 1726882972.33000: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28983 1726882972.33293: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 28983 1726882972.33296: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 28983 1726882972.33323: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28983 1726882972.33339: stdout chunk (state=3): >>># clear sys.meta_path 
# clear sys.modules # destroy _frozen_importlib <<< 28983 1726882972.33426: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 28983 1726882972.33457: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28983 1726882972.33664: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 28983 1726882972.33983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.34050: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882972.34091: stderr chunk (state=3): >>><<< 28983 1726882972.34256: stdout chunk (state=3): >>><<< 28983 1726882972.34541: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf31b4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3183b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf31b6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2f65160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2f65fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa3e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa3f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fdb8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fdbf50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fbbb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fb9280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa1040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fff800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ffe420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ffccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3030890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2fa02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf3030d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3030bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf3030fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2f9ede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3031670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3031340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3032510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3048740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf3049e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf304acf0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf304b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf304a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf304bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf304b500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3032570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d53c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d7c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d7c770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2d7c950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d51e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7df70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7cbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf3032c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dae300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dc6480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2e03230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2e259d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2e03350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dc7110> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c40350> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2dc54c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2d7eea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2cf2c40530> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ll4go3kt/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ca9fa0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c80e90> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c43f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2c83e30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2cd9970> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cd9700> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cd9010> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cd9460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2caac30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2cda720> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2cda960> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2cdaea0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b40b90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b427b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b430e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b43f20> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b46d80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b470b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b45040> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b4ae10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b498e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b49670> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b4bc80> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b45550> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b8efc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b8f1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b90c20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b909e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b931a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b912e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b9e960> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b932f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b9fbf0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b9f9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2b9fd40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2b8f380> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2ba34d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2ba4770> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ba1c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2ba2fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ba1820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a2c6e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2d490> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ba0980> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2d4f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2f950> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a36000> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a36900> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a2ea20> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf2a35550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a36b10> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ac6c90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a439e0> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a3ab40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2a3a900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acda90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f441a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1f444d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ab5250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2ab4110> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acc200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acfa40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1f47410> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f46cc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1f46ea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f46150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f474d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1fb1fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1f47f80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf2acfd40> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1fb38c0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1fb2ba0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1fe6240> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1fd2f00> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1ffdb20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1ffda90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2cf1e2b5c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1e2aae0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2cf1e29340> {"ansible_facts": 
{"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "42", "second": "52", "epoch": "1726882972", "epoch_int": "1726882972", "date": "2024-09-20", "time": "21:42:52", "iso8601_micro": "2024-09-21T01:42:52.301070Z", "iso8601": "2024-09-21T01:42:52Z", "iso8601_basic": "20240920T214252301070", "iso8601_basic_short": "20240920T214252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, 
"console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMAGmz4coceQATe4wVCPswNHKDq397sHN0wP3lxcDx5YXYj3mfO2mlh1Dpa7TCdRgLKwtozxXO6KafFlS3d0x9UWzSyKR0sSc77mhua/o3Y8EThq+wmVYqwwMQp1Vh8aBTvONV8N1UHqLp3aOdJIjHMGYdoUzUEF7xedcrV0fOV9AAAAFQDiC9S/VmOYdv/C8sXiFstIvsP/FQAAAIEApapvkLljxqN9GCi5UqXohiznCnndWFY9Vt/4wN+GtUjkuNJBqYHErEZCKfujpgVR94wM4sP3DbiJkL+OurGNHPJn7qrXDGQNIKExN7q3EzJI6yKBYdq1pnuhK1fBE/B8I/GQAEoqP3PMoutNlf85wWVgmt1DBc+D9D87BEGZzFoAAACBAIyk6Zb39dUz0T2fpmnSTF7AJHxsuBXwGZH1/5c5tWS0QGhwu5nzEoJUkQLhk+JqFJVRjNKoZ8wzH8N32ZrE15HfLF6/uIlfBorDH5AhDSnVumVmGZtYAerr8Cch5xqDXZSHTUhi7nBmdY/IKTgk7lCs0q4c7ja/wOueEHXkfWdF", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDjr2n9pN3Skd0p3fTYngf1LkuFARyHo/RDL4n6yhKcuFMhqe9MjqKkyMOvWaeNvTMkAiQ7TNEROz2mBqoPSot74UYaDR1nw9xQRq7skd9l/L6FlWrbg6EBCcQZUgkgjucBgmk3+INE0QVUdywVyW2IdmqayH/fLojViFulCLWlWn9cjFclC/t+sfMfoY6DrRIoi1GlfdlEfEHT9zGqC5syJrp6Yb3b9Ho/CYNAXya6aAHzMTLkx0/kU4czCptGZ9ew7HWLOtMv8iahxGrAp1VW4jj76+SZ5OisJ9N7+g8GPnaNAvsDNldGNQJWME6YNcEbxHblmHAEU0lq9EydM2W5iQHUnSezOSqQBljsiUACwwxZSphsqFYQnsHv4Vl/NlVTAJOApkU0VehWPUtOQNiqG+W/VGFMqqBksxF5tVDTO+qkvF5bm8JT2RSHAIbRpPPOYA8fg1PEPu1ONXD99Jn3urd0Y0kvUfp2NzPk1JFbxcGh+uDNHR2t5bVuOyRvmy0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKj0SRB0MBwzo3XKoUBfi6MCOa8n8Z6sjosikvEKYLTWy/hzFaSt2hhtv0qoPi/CAERuCNgGQ5pZPiqBpnr9C8A=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHVmW9cqR4t5U02ebXgqIiDjJ0aeuxmuwOiTXv538jBQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-46-139", "ansible_nodename": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2c52174af731fc996c81a6a9338a65", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 47942 10.31.46.139 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 47942 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear 
sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing 
importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] 
removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # 
cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] 
removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # 
destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector 
# destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # 
cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # 
cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] 
removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # 
cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # 
destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # 
destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy 
_collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 28983 1726882972.36861: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882972.36876: _low_level_execute_command(): starting 28983 1726882972.36882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882971.7467287-29070-197833217790265/ > /dev/null 2>&1 && sleep 0' 28983 1726882972.36885: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882972.37021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882972.37024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882972.37026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726882972.37028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882972.37031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.37222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882972.37226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.37264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882972.37339: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882972.39322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.39394: stderr chunk (state=3): >>><<< 28983 1726882972.39397: stdout chunk (state=3): >>><<< 28983 1726882972.39417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882972.39424: handler run complete 28983 1726882972.39487: variable 'ansible_facts' from source: unknown 28983 1726882972.39555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882972.39730: variable 'ansible_facts' from source: unknown 28983 1726882972.39793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882972.39875: attempt loop 
complete, returning result 28983 1726882972.39881: _execute() done 28983 1726882972.39884: dumping result to json 28983 1726882972.39897: done dumping result, returning 28983 1726882972.39907: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affe814-3a2d-b16d-c0a7-00000000002c] 28983 1726882972.39912: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000002c 28983 1726882972.40107: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000002c 28983 1726882972.40109: WORKER PROCESS EXITING ok: [managed_node2] 28983 1726882972.40351: no more pending results, returning what we have 28983 1726882972.40354: results queue empty 28983 1726882972.40355: checking for any_errors_fatal 28983 1726882972.40357: done checking for any_errors_fatal 28983 1726882972.40358: checking for max_fail_percentage 28983 1726882972.40359: done checking for max_fail_percentage 28983 1726882972.40360: checking to see if all hosts have failed and the running result is not ok 28983 1726882972.40361: done checking to see if all hosts have failed 28983 1726882972.40362: getting the remaining hosts for this loop 28983 1726882972.40363: done getting the remaining hosts for this loop 28983 1726882972.40367: getting the next task for host managed_node2 28983 1726882972.40375: done getting next task for host managed_node2 28983 1726882972.40377: ^ task is: TASK: Check if system is ostree 28983 1726882972.40382: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726882972.40386: getting variables 28983 1726882972.40387: in VariableManager get_vars() 28983 1726882972.40421: Calling all_inventory to load vars for managed_node2 28983 1726882972.40425: Calling groups_inventory to load vars for managed_node2 28983 1726882972.40429: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882972.40441: Calling all_plugins_play to load vars for managed_node2 28983 1726882972.40445: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882972.40449: Calling groups_plugins_play to load vars for managed_node2 28983 1726882972.40715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882972.41054: done with get_vars() 28983 1726882972.41069: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:42:52 -0400 (0:00:00.773) 0:00:02.409 ****** 28983 1726882972.41184: entering _queue_task() for managed_node2/stat 28983 1726882972.41551: worker is 1 (out of 1 available) 28983 1726882972.41563: exiting _queue_task() for managed_node2/stat 28983 1726882972.41575: done queuing things up, now waiting for results queue to drain 28983 1726882972.41577: waiting for pending results... 
28983 1726882972.41853: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 28983 1726882972.41858: in run() - task 0affe814-3a2d-b16d-c0a7-00000000002e 28983 1726882972.41873: variable 'ansible_search_path' from source: unknown 28983 1726882972.41882: variable 'ansible_search_path' from source: unknown 28983 1726882972.41923: calling self._execute() 28983 1726882972.42016: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882972.42028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882972.42062: variable 'omit' from source: magic vars 28983 1726882972.42609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882972.42934: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882972.42990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882972.43038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882972.43153: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882972.43188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882972.43221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882972.43268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882972.43311: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882972.43460: Evaluated conditional (not __network_is_ostree is defined): True 28983 1726882972.43476: variable 'omit' from source: magic vars 28983 1726882972.43525: variable 'omit' from source: magic vars 28983 1726882972.43572: variable 'omit' from source: magic vars 28983 1726882972.43613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882972.43653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882972.43695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882972.43711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882972.43740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882972.43766: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882972.43776: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882972.43803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882972.43914: Set connection var ansible_connection to ssh 28983 1726882972.43938: Set connection var ansible_shell_executable to /bin/sh 28983 1726882972.44022: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882972.44025: Set connection var ansible_timeout to 10 28983 1726882972.44029: Set connection var ansible_pipelining to False 28983 1726882972.44031: Set connection var ansible_shell_type to sh 28983 1726882972.44034: variable 'ansible_shell_executable' from source: unknown 28983 1726882972.44036: variable 'ansible_connection' from 
source: unknown 28983 1726882972.44040: variable 'ansible_module_compression' from source: unknown 28983 1726882972.44042: variable 'ansible_shell_type' from source: unknown 28983 1726882972.44044: variable 'ansible_shell_executable' from source: unknown 28983 1726882972.44046: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882972.44131: variable 'ansible_pipelining' from source: unknown 28983 1726882972.44136: variable 'ansible_timeout' from source: unknown 28983 1726882972.44139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882972.44255: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882972.44276: variable 'omit' from source: magic vars 28983 1726882972.44286: starting attempt loop 28983 1726882972.44294: running the handler 28983 1726882972.44311: _low_level_execute_command(): starting 28983 1726882972.44324: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882972.45134: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.45201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.45233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882972.45352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882972.47123: stdout chunk (state=3): >>>/root <<< 28983 1726882972.47251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.47337: stderr chunk (state=3): >>><<< 28983 1726882972.47341: stdout chunk (state=3): >>><<< 28983 1726882972.47363: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 28983 1726882972.47471: _low_level_execute_command(): starting 28983 1726882972.47476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957 `" && echo ansible-tmp-1726882972.4737606-29101-213053831582957="` echo /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957 `" ) && sleep 0' 28983 1726882972.48019: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882972.48044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882972.48136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882972.48154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.48189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882972.48206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.48225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882972.48330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 
1726882972.50366: stdout chunk (state=3): >>>ansible-tmp-1726882972.4737606-29101-213053831582957=/root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957 <<< 28983 1726882972.50550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.50576: stdout chunk (state=3): >>><<< 28983 1726882972.50579: stderr chunk (state=3): >>><<< 28983 1726882972.50595: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882972.4737606-29101-213053831582957=/root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882972.50649: variable 'ansible_module_compression' from source: unknown 28983 1726882972.50738: ANSIBALLZ: Using lock for stat 28983 1726882972.50742: ANSIBALLZ: Acquiring lock 28983 1726882972.50744: ANSIBALLZ: Lock acquired: 140284035738576 28983 
1726882972.50747: ANSIBALLZ: Creating module 28983 1726882972.67403: ANSIBALLZ: Writing module into payload 28983 1726882972.67487: ANSIBALLZ: Writing module 28983 1726882972.67505: ANSIBALLZ: Renaming module 28983 1726882972.67512: ANSIBALLZ: Done creating module 28983 1726882972.67528: variable 'ansible_facts' from source: unknown 28983 1726882972.67578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py 28983 1726882972.67697: Sending initial data 28983 1726882972.67701: Sent initial data (153 bytes) 28983 1726882972.68167: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882972.68173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.68176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882972.68178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.68223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.68230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882972.68331: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882972.70082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882972.70154: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726882972.70226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxr0_5bnx /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py <<< 28983 1726882972.70230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py" <<< 28983 1726882972.70292: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxr0_5bnx" to remote "/root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py" <<< 28983 1726882972.71207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.71259: stderr chunk (state=3): >>><<< 28983 1726882972.71262: stdout chunk (state=3): >>><<< 28983 
1726882972.71361: done transferring module to remote 28983 1726882972.71364: _low_level_execute_command(): starting 28983 1726882972.71370: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/ /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py && sleep 0' 28983 1726882972.71991: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882972.71995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.71998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882972.72000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882972.72002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882972.72004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882972.72006: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726882972.72008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.72128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882972.72132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.72136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726882972.72205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882972.74137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.74144: stdout chunk (state=3): >>><<< 28983 1726882972.74150: stderr chunk (state=3): >>><<< 28983 1726882972.74163: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882972.74167: _low_level_execute_command(): starting 28983 1726882972.74172: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/AnsiballZ_stat.py && sleep 0' 28983 1726882972.74781: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.74805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882972.74915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882972.77140: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 28983 1726882972.77172: stdout chunk (state=3): >>>import _imp # builtin <<< 28983 1726882972.77213: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 28983 1726882972.77294: stdout chunk (state=3): >>>import '_io' # <<< 28983 1726882972.77301: stdout chunk (state=3): >>>import 'marshal' # <<< 28983 1726882972.77333: stdout chunk (state=3): >>>import 'posix' # <<< 28983 1726882972.77373: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 28983 1726882972.77408: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 28983 1726882972.77413: stdout chunk (state=3): >>># installed zipimport hook <<< 28983 1726882972.77471: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 28983 1726882972.77482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.77495: stdout chunk (state=3): >>>import '_codecs' # <<< 28983 1726882972.77520: stdout chunk (state=3): >>>import 'codecs' # <<< 28983 1726882972.77556: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 28983 1726882972.77595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 28983 1726882972.77604: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2480c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a247dbb30> <<< 28983 1726882972.77630: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 28983 1726882972.77652: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2480eab0> <<< 28983 1726882972.77667: stdout chunk (state=3): >>>import '_signal' # <<< 28983 1726882972.77698: stdout chunk (state=3): >>>import '_abc' # <<< 28983 1726882972.77703: stdout chunk (state=3): >>>import 'abc' # <<< 28983 1726882972.77726: stdout chunk (state=3): >>>import 'io' # <<< 28983 1726882972.77763: stdout chunk (state=3): >>>import '_stat' # <<< 28983 1726882972.77768: stdout chunk (state=3): >>>import 'stat' # <<< 28983 1726882972.77859: stdout chunk (state=3): >>>import '_collections_abc' # <<< 28983 1726882972.77891: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' 
# <<< 28983 1726882972.77922: stdout chunk (state=3): >>>import 'os' # <<< 28983 1726882972.77945: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 28983 1726882972.77958: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 28983 1726882972.77966: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 28983 1726882972.77980: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 28983 1726882972.77993: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 28983 1726882972.78022: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 28983 1726882972.78027: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 28983 1726882972.78058: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24621160> <<< 28983 1726882972.78116: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 28983 1726882972.78134: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.78141: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24621fd0> <<< 28983 1726882972.78169: stdout chunk (state=3): >>>import 'site' # <<< 28983 1726882972.78200: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 28983 1726882972.78455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 28983 1726882972.78461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 28983 1726882972.78490: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 28983 1726882972.78496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.78517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 28983 1726882972.78559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 28983 1726882972.78586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 28983 1726882972.78605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 28983 1726882972.78624: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465fe90> <<< 28983 1726882972.78645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 28983 1726882972.78665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 28983 1726882972.78692: stdout chunk (state=3): >>>import '_operator' # <<< 28983 1726882972.78700: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465ff50> <<< 28983 1726882972.78713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 28983 1726882972.78748: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 28983 1726882972.78767: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 28983 1726882972.78823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.78839: stdout chunk (state=3): >>>import 'itertools' # <<< 28983 1726882972.78865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 28983 1726882972.78903: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24697860> <<< 28983 1726882972.78907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 28983 1726882972.78909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 28983 1726882972.78977: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24697ef0> import '_collections' # <<< 28983 1726882972.79004: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24677b30> import '_functools' # <<< 28983 1726882972.79042: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246751f0> <<< 28983 1726882972.79156: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465d040> <<< 28983 1726882972.79186: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 28983 
1726882972.79215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 28983 1726882972.79229: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 28983 1726882972.79270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 28983 1726882972.79294: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 28983 1726882972.79323: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246bb800> <<< 28983 1726882972.79341: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ba420> <<< 28983 1726882972.79368: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246762a0> <<< 28983 1726882972.79401: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246b8c80> <<< 28983 1726882972.79438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 28983 1726882972.79450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465c2c0><<< 28983 1726882972.79482: stdout chunk 
(state=3): >>> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 28983 1726882972.79514: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.79526: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a246eccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ecb60> <<< 28983 1726882972.79556: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.79581: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a246ecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465ade0> <<< 28983 1726882972.79610: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.79638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 28983 1726882972.79698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 28983 1726882972.79701: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ed640> <<< 28983 1726882972.79707: stdout 
chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ed310> <<< 28983 1726882972.79709: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 28983 1726882972.79731: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 28983 1726882972.79764: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ee540> <<< 28983 1726882972.79768: stdout chunk (state=3): >>>import 'importlib.util' # <<< 28983 1726882972.79829: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 28983 1726882972.79866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 28983 1726882972.79892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24704770> import 'errno' # <<< 28983 1726882972.79922: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.79965: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24705eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 28983 1726882972.79987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 28983 
1726882972.80012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24706d80> <<< 28983 1726882972.80064: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.80073: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a247073b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a247062d0> <<< 28983 1726882972.80099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 28983 1726882972.80148: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.80158: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24707e30> <<< 28983 1726882972.80166: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24707560> <<< 28983 1726882972.80219: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ee5a0> <<< 28983 1726882972.80238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 28983 1726882972.80270: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 28983 1726882972.80284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 28983 1726882972.80308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 28983 1726882972.80343: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24487cb0> <<< 28983 1726882972.80368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 28983 1726882972.80375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 28983 1726882972.80400: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.80406: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a244b0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b04a0> <<< 28983 1726882972.80437: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a244b0770> <<< 28983 1726882972.80471: stdout chunk (state=3): >>># 
extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.80483: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a244b0950> <<< 28983 1726882972.80496: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24485e50> <<< 28983 1726882972.80518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 28983 1726882972.80638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 28983 1726882972.80666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 28983 1726882972.80678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 28983 1726882972.80685: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b2030> <<< 28983 1726882972.80707: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b0cb0> <<< 28983 1726882972.80729: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246eec90> <<< 28983 1726882972.80755: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 28983 1726882972.80815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.80831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches 
/usr/lib64/python3.12/threading.py <<< 28983 1726882972.80885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 28983 1726882972.80908: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244e2390> <<< 28983 1726882972.80970: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 28983 1726882972.80977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.81000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 28983 1726882972.81021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 28983 1726882972.81080: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244fa540> <<< 28983 1726882972.81098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 28983 1726882972.81147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 28983 1726882972.81199: stdout chunk (state=3): >>>import 'ntpath' # <<< 28983 1726882972.81225: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a245332c0> <<< 28983 1726882972.81251: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 28983 
1726882972.81289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 28983 1726882972.81318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 28983 1726882972.81357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 28983 1726882972.81459: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24559a60> <<< 28983 1726882972.81534: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a245333e0> <<< 28983 1726882972.81582: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244fb1d0> <<< 28983 1726882972.81607: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 28983 1726882972.81616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243782f0> <<< 28983 1726882972.81627: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244f9580> <<< 28983 1726882972.81639: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b2f30> <<< 28983 1726882972.81738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 28983 1726882972.81763: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2a24378500> <<< 28983 1726882972.81844: stdout chunk (state=3): >>># zipimport: found 30 names in 
'/tmp/ansible_stat_payload_h9irvmt6/ansible_stat_payload.zip' <<< 28983 1726882972.81850: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.82003: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.82037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 28983 1726882972.82044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 28983 1726882972.82091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 28983 1726882972.82172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 28983 1726882972.82212: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243cdfd0> <<< 28983 1726882972.82223: stdout chunk (state=3): >>>import '_typing' # <<< 28983 1726882972.82428: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243a4ec0> <<< 28983 1726882972.82431: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2437bf50> # zipimport: zlib available <<< 28983 1726882972.82467: stdout chunk (state=3): >>>import 'ansible' # <<< 28983 1726882972.82475: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.82499: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.82515: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.82528: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 28983 1726882972.82540: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.84137: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.85431: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 28983 1726882972.85440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243a7ce0> <<< 28983 1726882972.85463: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.85503: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 28983 1726882972.85532: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 28983 1726882972.85567: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243f5940> <<< 28983 1726882972.85616: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f56d0> <<< 28983 1726882972.85647: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f4fe0> <<< 28983 1726882972.85674: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 28983 1726882972.85680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 28983 1726882972.85726: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f5430> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243cea50> <<< 28983 1726882972.85743: stdout chunk (state=3): >>>import 'atexit' # <<< 28983 1726882972.85769: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.85774: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243f6720> <<< 28983 1726882972.85801: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.85811: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243f68a0> <<< 28983 1726882972.85826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 28983 1726882972.85887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 28983 1726882972.85895: stdout chunk (state=3): >>>import '_locale' # <<< 28983 1726882972.85955: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f6db0> import 'pwd' # <<< 28983 1726882972.85980: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 28983 1726882972.86007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 28983 1726882972.86054: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2425cc50> <<< 28983 1726882972.86078: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.86085: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2425e870> <<< 28983 1726882972.86103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 28983 1726882972.86124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 28983 1726882972.86162: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2425f1a0> <<< 28983 1726882972.86188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 28983 1726882972.86212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 28983 1726882972.86236: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24260380> <<< 28983 1726882972.86255: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 28983 1726882972.86298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 28983 
1726882972.86320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 28983 1726882972.86384: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24262de0> <<< 28983 1726882972.86422: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24262f00> <<< 28983 1726882972.86448: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242610a0> <<< 28983 1726882972.86467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 28983 1726882972.86503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 28983 1726882972.86521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 28983 1726882972.86527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 28983 1726882972.86547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 28983 1726882972.86586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 28983 1726882972.86611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 28983 1726882972.86617: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24266cc0> <<< 28983 1726882972.86640: stdout chunk (state=3): >>>import '_tokenize' # <<< 28983 1726882972.86712: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24265790> <<< 28983 1726882972.86723: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242654f0> <<< 28983 1726882972.86738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 28983 1726882972.86747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 28983 1726882972.86826: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24267f20> <<< 28983 1726882972.86861: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242615b0> <<< 28983 1726882972.86882: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.86891: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242aee10> <<< 28983 1726882972.86915: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 28983 1726882972.86922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2a242aefc0> <<< 28983 1726882972.86944: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 28983 1726882972.86967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 28983 1726882972.86990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 28983 1726882972.86996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 28983 1726882972.87032: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.87039: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242b0b90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242b0950> <<< 28983 1726882972.87060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 28983 1726882972.87188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 28983 1726882972.87240: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242b3110> <<< 28983 1726882972.87249: stdout chunk (state=3): >>>import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2a242b1280> <<< 28983 1726882972.87267: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 28983 1726882972.87320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.87351: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 28983 1726882972.87362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 28983 1726882972.87365: stdout chunk (state=3): >>>import '_string' # <<< 28983 1726882972.87420: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242ba900> <<< 28983 1726882972.87577: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242b3290> <<< 28983 1726882972.87656: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.87668: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bbc20> <<< 28983 1726882972.87701: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.87707: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f2a242bbad0> <<< 28983 1726882972.87758: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.87769: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bbb90> <<< 28983 1726882972.87788: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242af2c0> <<< 28983 1726882972.87807: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 28983 1726882972.87818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 28983 1726882972.87832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 28983 1726882972.87864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 28983 1726882972.87891: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.87925: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bf350> <<< 28983 1726882972.88118: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.88121: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242c03b0> <<< 28983 1726882972.88145: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242bdaf0> <<< 28983 1726882972.88171: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.88180: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bee70> <<< 28983 1726882972.88196: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242bd700> <<< 28983 1726882972.88215: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88220: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88225: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 28983 1726882972.88241: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88353: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88466: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88469: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 28983 1726882972.88498: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88510: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88516: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 28983 1726882972.88531: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.88679: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 28983 1726882972.88822: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.89524: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.90231: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 28983 1726882972.90237: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 28983 1726882972.90250: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 28983 1726882972.90271: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 28983 1726882972.90299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.90360: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.90366: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243445f0> <<< 28983 1726882972.90482: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 28983 1726882972.90492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 28983 1726882972.90513: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24345370> <<< 28983 1726882972.90525: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242c2ea0> <<< 28983 1726882972.90575: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 
28983 1726882972.90594: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.90611: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.90633: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 28983 1726882972.90647: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.90828: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.91017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 28983 1726882972.91021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 28983 1726882972.91042: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24345520> <<< 28983 1726882972.91051: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.91617: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92175: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92261: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92358: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 28983 1726882972.92368: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92411: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92453: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 28983 1726882972.92468: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92551: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92669: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 28983 1726882972.92677: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92699: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.parsing' # <<< 28983 1726882972.92721: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92765: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.92816: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 28983 1726882972.92822: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.93112: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.93400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 28983 1726882972.93479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 28983 1726882972.93487: stdout chunk (state=3): >>>import '_ast' # <<< 28983 1726882972.93587: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24347d10> <<< 28983 1726882972.93594: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.93688: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.93771: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 28983 1726882972.93781: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 28983 1726882972.93800: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 28983 1726882972.93820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 28983 1726882972.93833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 28983 1726882972.93919: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 28983 1726882972.94054: stdout 
chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24155d90> <<< 28983 1726882972.94109: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24156690> <<< 28983 1726882972.94125: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24346b10> <<< 28983 1726882972.94142: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94196: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94231: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 28983 1726882972.94253: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94296: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94349: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94408: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94490: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 28983 1726882972.94530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.94635: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24155370> <<< 28983 1726882972.94681: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a241567e0> <<< 28983 1726882972.94712: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 28983 1726882972.94722: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 28983 1726882972.94728: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94804: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94871: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94905: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.94949: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 28983 1726882972.94961: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 28983 1726882972.94972: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 28983 1726882972.95002: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 28983 1726882972.95023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 28983 1726882972.95091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 28983 1726882972.95108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches 
/usr/lib64/python3.12/gettext.py <<< 28983 1726882972.95131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 28983 1726882972.95195: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a241e69f0> <<< 28983 1726882972.95250: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24160740> <<< 28983 1726882972.95332: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2415a7b0> <<< 28983 1726882972.95347: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2415a600> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 28983 1726882972.95356: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.95385: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.95415: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 28983 1726882972.95484: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 28983 1726882972.95501: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.95508: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 28983 1726882972.95532: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.95699: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.95914: stdout chunk (state=3): >>># zipimport: zlib available <<< 28983 1726882972.96062: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 28983 1726882972.96068: stdout chunk 
(state=3): >>># destroy __main__ <<< 28983 1726882972.96418: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 28983 1726882972.96444: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 28983 1726882972.96457: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 28983 1726882972.96473: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib <<< 28983 1726882972.96502: stdout chunk (state=3): >>># 
cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno <<< 28983 1726882972.96505: stdout chunk (state=3): >>># cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 28983 1726882972.96517: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # 
cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil <<< 28983 1726882972.96538: stdout chunk (state=3): >>># destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 28983 1726882972.96562: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid <<< 28983 1726882972.96567: stdout chunk (state=3): >>># cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 
28983 1726882972.96583: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 28983 1726882972.96605: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 28983 1726882972.96611: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] 
removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info <<< 28983 1726882972.96624: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic <<< 28983 1726882972.96632: stdout chunk (state=3): >>># cleanup[2] removing ansible.modules # destroy ansible.modules <<< 28983 1726882972.96882: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 28983 1726882972.96907: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 28983 1726882972.96914: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct <<< 28983 1726882972.96942: stdout chunk (state=3): >>># destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile <<< 28983 1726882972.96950: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 28983 1726882972.96986: stdout chunk (state=3): >>># destroy ntpath <<< 28983 1726882972.97009: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # 
destroy systemd.daemon # destroy ansible.module_utils.compat.selinux <<< 28983 1726882972.97020: stdout chunk (state=3): >>># destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 28983 1726882972.97033: stdout chunk (state=3): >>># destroy _locale # destroy pwd<<< 28983 1726882972.97062: stdout chunk (state=3): >>> <<< 28983 1726882972.97066: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 28983 1726882972.97085: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 28983 1726882972.97102: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 28983 1726882972.97130: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 28983 1726882972.97141: stdout chunk (state=3): >>># destroy _hashlib <<< 28983 1726882972.97146: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil <<< 28983 1726882972.97155: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 28983 1726882972.97214: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 28983 1726882972.97244: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 28983 1726882972.97267: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform 
<<< 28983 1726882972.97290: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 28983 1726882972.97301: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 28983 1726882972.97323: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 28983 1726882972.97348: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 28983 1726882972.97365: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 28983 1726882972.97387: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 28983 1726882972.97408: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external <<< 28983 1726882972.97422: stdout chunk (state=3): >>># cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping 
_thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 28983 1726882972.97431: stdout chunk (state=3): >>># cleanup[3] wiping sys <<< 28983 1726882972.97451: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 28983 1726882972.97455: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 28983 1726882972.97605: stdout chunk (state=3): >>># destroy sys.monitoring <<< 28983 1726882972.97616: stdout chunk (state=3): >>># destroy _socket <<< 28983 1726882972.97622: stdout chunk (state=3): >>># destroy _collections <<< 28983 1726882972.97657: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 28983 1726882972.97668: stdout chunk (state=3): >>># destroy tokenize <<< 28983 1726882972.97688: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 28983 1726882972.97723: stdout chunk (state=3): >>># destroy _typing <<< 28983 1726882972.97731: stdout chunk (state=3): >>># destroy _tokenize <<< 28983 1726882972.97736: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 28983 1726882972.97753: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 28983 1726882972.97767: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 28983 1726882972.97892: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 28983 1726882972.97896: stdout 
chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 28983 1726882972.97905: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 28983 1726882972.97939: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 28983 1726882972.97980: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 28983 1726882972.97996: stdout chunk (state=3): >>># clear sys.audit hooks <<< 28983 1726882972.98477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882972.98491: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726882972.98546: stdout chunk (state=3): >>><<< 28983 1726882972.98549: stderr chunk (state=3): >>><<< 28983 1726882972.98610: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2a2480c530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a247dbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2480eab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24621160> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24621fd0> import 'site' # Python 3.12.5 (main, Aug 7 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24697860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2a24697ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24677b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246751f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246bb800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ba420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246b8c80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465c2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a246eccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ecb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a246ecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2465ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ed640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ed310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ee540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24704770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24705eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24706d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a247073b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a247062d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24707e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24707560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246ee5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24487cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a244b0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a244b0770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a244b0950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24485e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b2030> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b0cb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a246eec90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244e2390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244fa540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a245332c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24559a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a245333e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244fb1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243782f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244f9580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a244b2f30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2a24378500> # zipimport: found 30 names in '/tmp/ansible_stat_payload_h9irvmt6/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243cdfd0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243a4ec0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2437bf50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243a7ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243f5940> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f56d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f4fe0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f5430> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243cea50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243f6720> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243f68a0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a243f6db0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2425cc50> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2425e870> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2425f1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24260380> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24262de0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24262f00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242610a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24266cc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24265790> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242654f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24267f20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242615b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242aee10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242aefc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242b0b90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242b0950> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242b3110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242b1280> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242ba900> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242b3290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bbc20> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bbad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bbb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242af2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bf350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242c03b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242bdaf0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a242bee70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242bd700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a243445f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24345370> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a242c2ea0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24345520> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24347d10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24155d90> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24156690> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24346b10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a24155370> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a241567e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a241e69f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a24160740> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2415a7b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2415a600> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] 
removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing 
operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # 
cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # 
cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 28983 1726882972.99619: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': 
'2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882972.99622: _low_level_execute_command(): starting 28983 1726882972.99625: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882972.4737606-29101-213053831582957/ > /dev/null 2>&1 && sleep 0' 28983 1726882972.99721: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882972.99743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882972.99747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882972.99809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882972.99814: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882972.99816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882972.99887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882973.01953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882973.01957: stdout chunk (state=3): >>><<< 28983 1726882973.01959: stderr chunk (state=3): >>><<< 28983 1726882973.01985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882973.02139: handler run complete 28983 1726882973.02142: attempt loop complete, returning result 28983 1726882973.02145: _execute() done 28983 1726882973.02147: dumping result to json 28983 1726882973.02149: done dumping result, returning 28983 1726882973.02151: done running TaskExecutor() for 
managed_node2/TASK: Check if system is ostree [0affe814-3a2d-b16d-c0a7-00000000002e] 28983 1726882973.02153: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000002e 28983 1726882973.02222: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000002e 28983 1726882973.02225: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726882973.02304: no more pending results, returning what we have 28983 1726882973.02308: results queue empty 28983 1726882973.02309: checking for any_errors_fatal 28983 1726882973.02317: done checking for any_errors_fatal 28983 1726882973.02318: checking for max_fail_percentage 28983 1726882973.02321: done checking for max_fail_percentage 28983 1726882973.02322: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.02323: done checking to see if all hosts have failed 28983 1726882973.02324: getting the remaining hosts for this loop 28983 1726882973.02326: done getting the remaining hosts for this loop 28983 1726882973.02330: getting the next task for host managed_node2 28983 1726882973.02345: done getting next task for host managed_node2 28983 1726882973.02348: ^ task is: TASK: Set flag to indicate system is ostree 28983 1726882973.02352: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.02356: getting variables 28983 1726882973.02357: in VariableManager get_vars() 28983 1726882973.02391: Calling all_inventory to load vars for managed_node2 28983 1726882973.02394: Calling groups_inventory to load vars for managed_node2 28983 1726882973.02399: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.02410: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.02414: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.02418: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.03070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.03665: done with get_vars() 28983 1726882973.03681: done getting variables 28983 1726882973.03793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:42:53 -0400 (0:00:00.626) 0:00:03.036 ****** 28983 1726882973.03826: entering _queue_task() for managed_node2/set_fact 28983 1726882973.03828: Creating lock for set_fact 28983 1726882973.04105: worker is 1 (out of 1 available) 28983 1726882973.04117: exiting _queue_task() for managed_node2/set_fact 28983 1726882973.04131: done queuing things up, now waiting for results queue to drain 28983 1726882973.04133: waiting for pending results... 
28983 1726882973.04464: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 28983 1726882973.04545: in run() - task 0affe814-3a2d-b16d-c0a7-00000000002f 28983 1726882973.04549: variable 'ansible_search_path' from source: unknown 28983 1726882973.04552: variable 'ansible_search_path' from source: unknown 28983 1726882973.04560: calling self._execute() 28983 1726882973.04627: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.04637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.04669: variable 'omit' from source: magic vars 28983 1726882973.05262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882973.05548: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882973.05652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882973.05655: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882973.05681: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882973.05788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882973.05818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882973.05858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882973.05894: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882973.06030: Evaluated conditional (not __network_is_ostree is defined): True 28983 1726882973.06086: variable 'omit' from source: magic vars 28983 1726882973.06090: variable 'omit' from source: magic vars 28983 1726882973.06236: variable '__ostree_booted_stat' from source: set_fact 28983 1726882973.06300: variable 'omit' from source: magic vars 28983 1726882973.06332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882973.06373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882973.06404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882973.06430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882973.06448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882973.06504: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882973.06507: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.06510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.06722: Set connection var ansible_connection to ssh 28983 1726882973.06726: Set connection var ansible_shell_executable to /bin/sh 28983 1726882973.06728: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882973.06731: Set connection var ansible_timeout to 10 28983 1726882973.06733: Set connection var ansible_pipelining to False 28983 1726882973.06737: Set connection var ansible_shell_type to sh 28983 1726882973.06739: variable 'ansible_shell_executable' 
from source: unknown 28983 1726882973.06742: variable 'ansible_connection' from source: unknown 28983 1726882973.06744: variable 'ansible_module_compression' from source: unknown 28983 1726882973.06746: variable 'ansible_shell_type' from source: unknown 28983 1726882973.06748: variable 'ansible_shell_executable' from source: unknown 28983 1726882973.06750: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.06759: variable 'ansible_pipelining' from source: unknown 28983 1726882973.06767: variable 'ansible_timeout' from source: unknown 28983 1726882973.06776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.06905: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882973.06922: variable 'omit' from source: magic vars 28983 1726882973.06935: starting attempt loop 28983 1726882973.06948: running the handler 28983 1726882973.06966: handler run complete 28983 1726882973.06983: attempt loop complete, returning result 28983 1726882973.06991: _execute() done 28983 1726882973.06998: dumping result to json 28983 1726882973.07006: done dumping result, returning 28983 1726882973.07017: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-00000000002f] 28983 1726882973.07025: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000002f ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 28983 1726882973.07214: no more pending results, returning what we have 28983 1726882973.07217: results queue empty 28983 1726882973.07218: checking for any_errors_fatal 28983 1726882973.07224: done checking for any_errors_fatal 28983 
1726882973.07226: checking for max_fail_percentage 28983 1726882973.07228: done checking for max_fail_percentage 28983 1726882973.07229: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.07230: done checking to see if all hosts have failed 28983 1726882973.07231: getting the remaining hosts for this loop 28983 1726882973.07233: done getting the remaining hosts for this loop 28983 1726882973.07240: getting the next task for host managed_node2 28983 1726882973.07251: done getting next task for host managed_node2 28983 1726882973.07254: ^ task is: TASK: Fix CentOS6 Base repo 28983 1726882973.07257: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.07261: getting variables 28983 1726882973.07263: in VariableManager get_vars() 28983 1726882973.07299: Calling all_inventory to load vars for managed_node2 28983 1726882973.07302: Calling groups_inventory to load vars for managed_node2 28983 1726882973.07306: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.07317: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.07321: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.07325: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.07800: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000002f 28983 1726882973.07809: WORKER PROCESS EXITING 28983 1726882973.07837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.08164: done with get_vars() 28983 1726882973.08175: done getting variables 28983 1726882973.08305: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:42:53 -0400 (0:00:00.045) 0:00:03.081 ****** 28983 1726882973.08338: entering _queue_task() for managed_node2/copy 28983 1726882973.08573: worker is 1 (out of 1 available) 28983 1726882973.08588: exiting _queue_task() for managed_node2/copy 28983 1726882973.08598: done queuing things up, now waiting for results queue to drain 28983 1726882973.08600: waiting for pending results... 
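The `copy` task loaded above is skipped on this host because `ansible_distribution == 'CentOS'` evaluates to False, after which the play advances to the `include_tasks` at `el_repo_setup.yml:51`. A plausible sketch of those two tasks, assuming a conventional layout — the copy payload is not visible anywhere in the log and is left elided:

```yaml
# Hedged sketch; only the task names and the two conditionals
# ("ansible_distribution == 'CentOS'" and
#  "not __network_is_ostree | d(false)") are confirmed by the log.
- name: Fix CentOS6 Base repo
  copy:
    # destination and file content are not shown in the log; elided here
    dest: ...
    content: ...
  when: ansible_distribution == 'CentOS'

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```

The second conditional explains why the include runs here: `__network_is_ostree` was just set to `false`, so `not false` is True and `enable_epel.yml` is loaded for `managed_node2`, exactly as the "we have included files to process" entries below show.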
28983 1726882973.08847: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 28983 1726882973.08959: in run() - task 0affe814-3a2d-b16d-c0a7-000000000031 28983 1726882973.08985: variable 'ansible_search_path' from source: unknown 28983 1726882973.08993: variable 'ansible_search_path' from source: unknown 28983 1726882973.09077: calling self._execute() 28983 1726882973.09124: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.09139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.09155: variable 'omit' from source: magic vars 28983 1726882973.09693: variable 'ansible_distribution' from source: facts 28983 1726882973.09719: Evaluated conditional (ansible_distribution == 'CentOS'): False 28983 1726882973.09732: when evaluation is False, skipping this task 28983 1726882973.09839: _execute() done 28983 1726882973.09842: dumping result to json 28983 1726882973.09847: done dumping result, returning 28983 1726882973.09850: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affe814-3a2d-b16d-c0a7-000000000031] 28983 1726882973.09852: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000031 28983 1726882973.09924: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000031 28983 1726882973.09928: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 28983 1726882973.10002: no more pending results, returning what we have 28983 1726882973.10007: results queue empty 28983 1726882973.10008: checking for any_errors_fatal 28983 1726882973.10013: done checking for any_errors_fatal 28983 1726882973.10014: checking for max_fail_percentage 28983 1726882973.10016: done checking for max_fail_percentage 28983 1726882973.10018: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.10019: done 
checking to see if all hosts have failed 28983 1726882973.10020: getting the remaining hosts for this loop 28983 1726882973.10022: done getting the remaining hosts for this loop 28983 1726882973.10026: getting the next task for host managed_node2 28983 1726882973.10033: done getting next task for host managed_node2 28983 1726882973.10039: ^ task is: TASK: Include the task 'enable_epel.yml' 28983 1726882973.10043: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882973.10048: getting variables 28983 1726882973.10049: in VariableManager get_vars() 28983 1726882973.10081: Calling all_inventory to load vars for managed_node2 28983 1726882973.10084: Calling groups_inventory to load vars for managed_node2 28983 1726882973.10089: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.10102: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.10106: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.10110: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.10520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.10844: done with get_vars() 28983 1726882973.10855: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:42:53 -0400 (0:00:00.026) 0:00:03.107 ****** 28983 1726882973.10955: entering _queue_task() for managed_node2/include_tasks 28983 1726882973.11171: worker is 1 (out of 1 available) 28983 1726882973.11186: exiting _queue_task() for managed_node2/include_tasks 28983 1726882973.11196: done queuing things up, now waiting for results queue to drain 28983 1726882973.11198: waiting for pending results... 28983 1726882973.11649: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 28983 1726882973.11654: in run() - task 0affe814-3a2d-b16d-c0a7-000000000032 28983 1726882973.11657: variable 'ansible_search_path' from source: unknown 28983 1726882973.11660: variable 'ansible_search_path' from source: unknown 28983 1726882973.11663: calling self._execute() 28983 1726882973.11674: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.11691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.11708: variable 'omit' from source: magic vars 28983 1726882973.12274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882973.15069: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882973.15154: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882973.15202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882973.15254: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882973.15294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882973.15394: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882973.15433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882973.15476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882973.15538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882973.15561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882973.15699: variable '__network_is_ostree' from source: set_fact 28983 1726882973.15722: Evaluated conditional (not __network_is_ostree | d(false)): True 28983 1726882973.15732: _execute() done 28983 1726882973.15742: dumping result to json 28983 1726882973.15795: done dumping result, returning 28983 1726882973.15799: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affe814-3a2d-b16d-c0a7-000000000032] 28983 1726882973.15801: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000032 28983 1726882973.15928: no more pending results, returning what we have 28983 1726882973.15935: in VariableManager get_vars() 28983 1726882973.15971: Calling all_inventory to load vars for managed_node2 28983 1726882973.15974: Calling groups_inventory to load vars for managed_node2 28983 1726882973.15981: Calling all_plugins_inventory to load vars 
for managed_node2 28983 1726882973.15992: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.15995: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.15999: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.16417: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000032 28983 1726882973.16421: WORKER PROCESS EXITING 28983 1726882973.16448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.16767: done with get_vars() 28983 1726882973.16776: variable 'ansible_search_path' from source: unknown 28983 1726882973.16778: variable 'ansible_search_path' from source: unknown 28983 1726882973.16823: we have included files to process 28983 1726882973.16824: generating all_blocks data 28983 1726882973.16826: done generating all_blocks data 28983 1726882973.16832: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28983 1726882973.16836: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28983 1726882973.16839: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 28983 1726882973.17891: done processing included file 28983 1726882973.17894: iterating over new_blocks loaded from include file 28983 1726882973.17895: in VariableManager get_vars() 28983 1726882973.17908: done with get_vars() 28983 1726882973.17910: filtering new block on tags 28983 1726882973.17940: done filtering new block on tags 28983 1726882973.17943: in VariableManager get_vars() 28983 1726882973.17956: done with get_vars() 28983 1726882973.17958: filtering new block on tags 28983 1726882973.17972: done filtering new block on tags 28983 1726882973.17974: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 28983 1726882973.17982: extending task lists for all hosts with included blocks 28983 1726882973.18121: done extending task lists 28983 1726882973.18123: done processing included files 28983 1726882973.18124: results queue empty 28983 1726882973.18125: checking for any_errors_fatal 28983 1726882973.18128: done checking for any_errors_fatal 28983 1726882973.18129: checking for max_fail_percentage 28983 1726882973.18130: done checking for max_fail_percentage 28983 1726882973.18131: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.18132: done checking to see if all hosts have failed 28983 1726882973.18133: getting the remaining hosts for this loop 28983 1726882973.18136: done getting the remaining hosts for this loop 28983 1726882973.18138: getting the next task for host managed_node2 28983 1726882973.18143: done getting next task for host managed_node2 28983 1726882973.18146: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 28983 1726882973.18149: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.18151: getting variables 28983 1726882973.18152: in VariableManager get_vars() 28983 1726882973.18161: Calling all_inventory to load vars for managed_node2 28983 1726882973.18164: Calling groups_inventory to load vars for managed_node2 28983 1726882973.18167: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.18172: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.18182: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.18186: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.18421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.18752: done with get_vars() 28983 1726882973.18762: done getting variables 28983 1726882973.18838: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 28983 1726882973.19058: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:42:53 -0400 (0:00:00.081) 0:00:03.188 ****** 28983 1726882973.19108: entering _queue_task() for managed_node2/command 28983 1726882973.19110: Creating lock for command 28983 1726882973.19347: worker is 1 (out of 1 available) 28983 1726882973.19359: exiting _queue_task() for managed_node2/command 28983 1726882973.19370: done queuing things up, now waiting for results queue to drain 28983 1726882973.19372: waiting for pending results... 
28983 1726882973.19627: running TaskExecutor() for managed_node2/TASK: Create EPEL 39 28983 1726882973.19756: in run() - task 0affe814-3a2d-b16d-c0a7-00000000004c 28983 1726882973.19776: variable 'ansible_search_path' from source: unknown 28983 1726882973.19787: variable 'ansible_search_path' from source: unknown 28983 1726882973.19826: calling self._execute() 28983 1726882973.19916: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.19930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.19948: variable 'omit' from source: magic vars 28983 1726882973.20392: variable 'ansible_distribution' from source: facts 28983 1726882973.20411: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 28983 1726882973.20419: when evaluation is False, skipping this task 28983 1726882973.20428: _execute() done 28983 1726882973.20438: dumping result to json 28983 1726882973.20446: done dumping result, returning 28983 1726882973.20458: done running TaskExecutor() for managed_node2/TASK: Create EPEL 39 [0affe814-3a2d-b16d-c0a7-00000000004c] 28983 1726882973.20468: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 28983 1726882973.20672: no more pending results, returning what we have 28983 1726882973.20675: results queue empty 28983 1726882973.20676: checking for any_errors_fatal 28983 1726882973.20681: done checking for any_errors_fatal 28983 1726882973.20682: checking for max_fail_percentage 28983 1726882973.20684: done checking for max_fail_percentage 28983 1726882973.20685: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.20686: done checking to see if all hosts have failed 28983 1726882973.20687: getting the remaining hosts for this loop 28983 1726882973.20689: done 
getting the remaining hosts for this loop 28983 1726882973.20693: getting the next task for host managed_node2 28983 1726882973.20700: done getting next task for host managed_node2 28983 1726882973.20703: ^ task is: TASK: Install yum-utils package 28983 1726882973.20707: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.20711: getting variables 28983 1726882973.20713: in VariableManager get_vars() 28983 1726882973.20747: Calling all_inventory to load vars for managed_node2 28983 1726882973.20750: Calling groups_inventory to load vars for managed_node2 28983 1726882973.20755: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.20768: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.20772: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.20776: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.21149: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004c 28983 1726882973.21153: WORKER PROCESS EXITING 28983 1726882973.21272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.21633: done with get_vars() 28983 1726882973.21646: done getting variables 28983 1726882973.21763: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:42:53 -0400 (0:00:00.026) 0:00:03.215 ****** 28983 1726882973.21794: entering _queue_task() for managed_node2/package 28983 1726882973.21796: Creating lock for package 28983 1726882973.22069: worker is 1 (out of 1 available) 28983 1726882973.22082: exiting _queue_task() for managed_node2/package 28983 1726882973.22096: done queuing things up, now waiting for results queue to drain 28983 1726882973.22098: waiting for pending results... 
28983 1726882973.22373: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 28983 1726882973.22526: in run() - task 0affe814-3a2d-b16d-c0a7-00000000004d 28983 1726882973.22549: variable 'ansible_search_path' from source: unknown 28983 1726882973.22557: variable 'ansible_search_path' from source: unknown 28983 1726882973.22604: calling self._execute() 28983 1726882973.22698: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.22712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.22743: variable 'omit' from source: magic vars 28983 1726882973.23216: variable 'ansible_distribution' from source: facts 28983 1726882973.23238: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 28983 1726882973.23247: when evaluation is False, skipping this task 28983 1726882973.23270: _execute() done 28983 1726882973.23539: dumping result to json 28983 1726882973.23545: done dumping result, returning 28983 1726882973.23548: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affe814-3a2d-b16d-c0a7-00000000004d] 28983 1726882973.23550: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004d 28983 1726882973.23614: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004d 28983 1726882973.23617: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 28983 1726882973.23676: no more pending results, returning what we have 28983 1726882973.23680: results queue empty 28983 1726882973.23681: checking for any_errors_fatal 28983 1726882973.23692: done checking for any_errors_fatal 28983 1726882973.23693: checking for max_fail_percentage 28983 1726882973.23695: done checking for max_fail_percentage 28983 1726882973.23696: checking to see if all hosts have failed and the running result is not ok 
28983 1726882973.23697: done checking to see if all hosts have failed 28983 1726882973.23698: getting the remaining hosts for this loop 28983 1726882973.23700: done getting the remaining hosts for this loop 28983 1726882973.23704: getting the next task for host managed_node2 28983 1726882973.23712: done getting next task for host managed_node2 28983 1726882973.23715: ^ task is: TASK: Enable EPEL 7 28983 1726882973.23720: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.23724: getting variables 28983 1726882973.23726: in VariableManager get_vars() 28983 1726882973.23759: Calling all_inventory to load vars for managed_node2 28983 1726882973.23762: Calling groups_inventory to load vars for managed_node2 28983 1726882973.23767: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.23780: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.23784: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.23788: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.24182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.24517: done with get_vars() 28983 1726882973.24529: done getting variables 28983 1726882973.24602: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:42:53 -0400 (0:00:00.028) 0:00:03.244 ****** 28983 1726882973.24638: entering _queue_task() for managed_node2/command 28983 1726882973.24869: worker is 1 (out of 1 available) 28983 1726882973.24880: exiting _queue_task() for managed_node2/command 28983 1726882973.24891: done queuing things up, now waiting for results queue to drain 28983 1726882973.24893: waiting for pending results... 
28983 1726882973.25268: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 28983 1726882973.25303: in run() - task 0affe814-3a2d-b16d-c0a7-00000000004e 28983 1726882973.25323: variable 'ansible_search_path' from source: unknown 28983 1726882973.25331: variable 'ansible_search_path' from source: unknown 28983 1726882973.25385: calling self._execute() 28983 1726882973.25488: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.25539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.25542: variable 'omit' from source: magic vars 28983 1726882973.25997: variable 'ansible_distribution' from source: facts 28983 1726882973.26025: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 28983 1726882973.26036: when evaluation is False, skipping this task 28983 1726882973.26045: _execute() done 28983 1726882973.26053: dumping result to json 28983 1726882973.26121: done dumping result, returning 28983 1726882973.26125: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affe814-3a2d-b16d-c0a7-00000000004e] 28983 1726882973.26132: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004e 28983 1726882973.26198: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004e 28983 1726882973.26201: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 28983 1726882973.26259: no more pending results, returning what we have 28983 1726882973.26263: results queue empty 28983 1726882973.26264: checking for any_errors_fatal 28983 1726882973.26271: done checking for any_errors_fatal 28983 1726882973.26272: checking for max_fail_percentage 28983 1726882973.26274: done checking for max_fail_percentage 28983 1726882973.26274: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.26275: 
done checking to see if all hosts have failed 28983 1726882973.26276: getting the remaining hosts for this loop 28983 1726882973.26278: done getting the remaining hosts for this loop 28983 1726882973.26282: getting the next task for host managed_node2 28983 1726882973.26289: done getting next task for host managed_node2 28983 1726882973.26293: ^ task is: TASK: Enable EPEL 8 28983 1726882973.26297: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.26301: getting variables 28983 1726882973.26302: in VariableManager get_vars() 28983 1726882973.26333: Calling all_inventory to load vars for managed_node2 28983 1726882973.26339: Calling groups_inventory to load vars for managed_node2 28983 1726882973.26343: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.26467: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.26471: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.26474: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.26802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.27089: done with get_vars() 28983 1726882973.27099: done getting variables 28983 1726882973.27164: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:42:53 -0400 (0:00:00.025) 0:00:03.269 ****** 28983 1726882973.27194: entering _queue_task() for managed_node2/command 28983 1726882973.27477: worker is 1 (out of 1 available) 28983 1726882973.27488: exiting _queue_task() for managed_node2/command 28983 1726882973.27498: done queuing things up, now waiting for results queue to drain 28983 1726882973.27500: waiting for pending results... 
28983 1726882973.27782: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 28983 1726882973.27840: in run() - task 0affe814-3a2d-b16d-c0a7-00000000004f 28983 1726882973.27844: variable 'ansible_search_path' from source: unknown 28983 1726882973.27847: variable 'ansible_search_path' from source: unknown 28983 1726882973.27893: calling self._execute() 28983 1726882973.27972: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.28040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.28044: variable 'omit' from source: magic vars 28983 1726882973.28461: variable 'ansible_distribution' from source: facts 28983 1726882973.28482: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 28983 1726882973.28495: when evaluation is False, skipping this task 28983 1726882973.28504: _execute() done 28983 1726882973.28513: dumping result to json 28983 1726882973.28523: done dumping result, returning 28983 1726882973.28645: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affe814-3a2d-b16d-c0a7-00000000004f] 28983 1726882973.28649: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004f 28983 1726882973.28717: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000004f 28983 1726882973.28720: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 28983 1726882973.28800: no more pending results, returning what we have 28983 1726882973.28804: results queue empty 28983 1726882973.28805: checking for any_errors_fatal 28983 1726882973.28812: done checking for any_errors_fatal 28983 1726882973.28813: checking for max_fail_percentage 28983 1726882973.28815: done checking for max_fail_percentage 28983 1726882973.28816: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.28817: 
done checking to see if all hosts have failed 28983 1726882973.28819: getting the remaining hosts for this loop 28983 1726882973.28821: done getting the remaining hosts for this loop 28983 1726882973.28825: getting the next task for host managed_node2 28983 1726882973.28839: done getting next task for host managed_node2 28983 1726882973.28842: ^ task is: TASK: Enable EPEL 6 28983 1726882973.28847: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.28851: getting variables 28983 1726882973.28853: in VariableManager get_vars() 28983 1726882973.28893: Calling all_inventory to load vars for managed_node2 28983 1726882973.28897: Calling groups_inventory to load vars for managed_node2 28983 1726882973.28901: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.28915: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.28919: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.28923: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.29321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.29656: done with get_vars() 28983 1726882973.29667: done getting variables 28983 1726882973.29735: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:42:53 -0400 (0:00:00.025) 0:00:03.295 ****** 28983 1726882973.29769: entering _queue_task() for managed_node2/copy 28983 1726882973.30008: worker is 1 (out of 1 available) 28983 1726882973.30020: exiting _queue_task() for managed_node2/copy 28983 1726882973.30031: done queuing things up, now waiting for results queue to drain 28983 1726882973.30033: waiting for pending results... 
28983 1726882973.30281: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 28983 1726882973.30440: in run() - task 0affe814-3a2d-b16d-c0a7-000000000051 28983 1726882973.30444: variable 'ansible_search_path' from source: unknown 28983 1726882973.30447: variable 'ansible_search_path' from source: unknown 28983 1726882973.30479: calling self._execute() 28983 1726882973.30568: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.30642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.30645: variable 'omit' from source: magic vars 28983 1726882973.31150: variable 'ansible_distribution' from source: facts 28983 1726882973.31168: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 28983 1726882973.31177: when evaluation is False, skipping this task 28983 1726882973.31185: _execute() done 28983 1726882973.31193: dumping result to json 28983 1726882973.31205: done dumping result, returning 28983 1726882973.31221: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affe814-3a2d-b16d-c0a7-000000000051] 28983 1726882973.31232: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000051 28983 1726882973.31475: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000051 28983 1726882973.31479: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 28983 1726882973.31520: no more pending results, returning what we have 28983 1726882973.31524: results queue empty 28983 1726882973.31525: checking for any_errors_fatal 28983 1726882973.31529: done checking for any_errors_fatal 28983 1726882973.31530: checking for max_fail_percentage 28983 1726882973.31532: done checking for max_fail_percentage 28983 1726882973.31535: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.31536: 
done checking to see if all hosts have failed 28983 1726882973.31537: getting the remaining hosts for this loop 28983 1726882973.31539: done getting the remaining hosts for this loop 28983 1726882973.31542: getting the next task for host managed_node2 28983 1726882973.31555: done getting next task for host managed_node2 28983 1726882973.31558: ^ task is: TASK: Set network provider to 'nm' 28983 1726882973.31560: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882973.31564: getting variables 28983 1726882973.31565: in VariableManager get_vars() 28983 1726882973.31591: Calling all_inventory to load vars for managed_node2 28983 1726882973.31594: Calling groups_inventory to load vars for managed_node2 28983 1726882973.31598: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.31608: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.31611: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.31615: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.31989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.32319: done with get_vars() 28983 1726882973.32330: done getting variables 28983 1726882973.32395: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Friday 20 September 2024 21:42:53 -0400 (0:00:00.026) 0:00:03.322 ****** 28983 1726882973.32430: entering _queue_task() for managed_node2/set_fact 28983 1726882973.32774: worker is 1 (out of 1 available) 28983 1726882973.32787: exiting _queue_task() for managed_node2/set_fact 28983 1726882973.32798: done queuing things up, now waiting for results queue to drain 28983 1726882973.32800: waiting for pending results... 28983 1726882973.33053: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 28983 1726882973.33059: in run() - task 0affe814-3a2d-b16d-c0a7-000000000007 28983 1726882973.33077: variable 'ansible_search_path' from source: unknown 28983 1726882973.33119: calling self._execute() 28983 1726882973.33216: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.33229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.33250: variable 'omit' from source: magic vars 28983 1726882973.33380: variable 'omit' from source: magic vars 28983 1726882973.33426: variable 'omit' from source: magic vars 28983 1726882973.33505: variable 'omit' from source: magic vars 28983 1726882973.33530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882973.33580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882973.33617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882973.33646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882973.33723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882973.33727: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882973.33729: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.33731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.33852: Set connection var ansible_connection to ssh 28983 1726882973.33871: Set connection var ansible_shell_executable to /bin/sh 28983 1726882973.33887: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882973.33903: Set connection var ansible_timeout to 10 28983 1726882973.33916: Set connection var ansible_pipelining to False 28983 1726882973.33924: Set connection var ansible_shell_type to sh 28983 1726882973.33964: variable 'ansible_shell_executable' from source: unknown 28983 1726882973.33973: variable 'ansible_connection' from source: unknown 28983 1726882973.33982: variable 'ansible_module_compression' from source: unknown 28983 1726882973.34052: variable 'ansible_shell_type' from source: unknown 28983 1726882973.34055: variable 'ansible_shell_executable' from source: unknown 28983 1726882973.34060: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.34063: variable 'ansible_pipelining' from source: unknown 28983 1726882973.34065: variable 'ansible_timeout' from source: unknown 28983 1726882973.34067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.34270: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882973.34274: variable 'omit' from source: magic vars 28983 1726882973.34276: starting attempt loop 28983 1726882973.34278: running the handler 28983 1726882973.34281: handler run complete 28983 1726882973.34283: attempt loop 
complete, returning result 28983 1726882973.34290: _execute() done 28983 1726882973.34293: dumping result to json 28983 1726882973.34295: done dumping result, returning 28983 1726882973.34306: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affe814-3a2d-b16d-c0a7-000000000007] 28983 1726882973.34316: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 28983 1726882973.34679: no more pending results, returning what we have 28983 1726882973.34682: results queue empty 28983 1726882973.34684: checking for any_errors_fatal 28983 1726882973.34688: done checking for any_errors_fatal 28983 1726882973.34689: checking for max_fail_percentage 28983 1726882973.34691: done checking for max_fail_percentage 28983 1726882973.34692: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.34693: done checking to see if all hosts have failed 28983 1726882973.34694: getting the remaining hosts for this loop 28983 1726882973.34695: done getting the remaining hosts for this loop 28983 1726882973.34699: getting the next task for host managed_node2 28983 1726882973.34705: done getting next task for host managed_node2 28983 1726882973.34707: ^ task is: TASK: meta (flush_handlers) 28983 1726882973.34709: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.34713: getting variables 28983 1726882973.34714: in VariableManager get_vars() 28983 1726882973.34748: Calling all_inventory to load vars for managed_node2 28983 1726882973.34751: Calling groups_inventory to load vars for managed_node2 28983 1726882973.34754: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.34761: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000007 28983 1726882973.34764: WORKER PROCESS EXITING 28983 1726882973.34772: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.34776: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.34780: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.35035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.35365: done with get_vars() 28983 1726882973.35376: done getting variables 28983 1726882973.35454: in VariableManager get_vars() 28983 1726882973.35465: Calling all_inventory to load vars for managed_node2 28983 1726882973.35467: Calling groups_inventory to load vars for managed_node2 28983 1726882973.35471: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.35476: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.35479: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.35482: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.35740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.36073: done with get_vars() 28983 1726882973.36089: done queuing things up, now waiting for results queue to drain 28983 1726882973.36092: results queue empty 28983 1726882973.36093: checking for any_errors_fatal 28983 1726882973.36095: done checking for any_errors_fatal 28983 1726882973.36096: checking for max_fail_percentage 28983 
1726882973.36098: done checking for max_fail_percentage 28983 1726882973.36098: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.36099: done checking to see if all hosts have failed 28983 1726882973.36100: getting the remaining hosts for this loop 28983 1726882973.36102: done getting the remaining hosts for this loop 28983 1726882973.36104: getting the next task for host managed_node2 28983 1726882973.36109: done getting next task for host managed_node2 28983 1726882973.36110: ^ task is: TASK: meta (flush_handlers) 28983 1726882973.36112: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882973.36120: getting variables 28983 1726882973.36122: in VariableManager get_vars() 28983 1726882973.36131: Calling all_inventory to load vars for managed_node2 28983 1726882973.36135: Calling groups_inventory to load vars for managed_node2 28983 1726882973.36138: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.36144: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.36147: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.36151: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.36378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.36704: done with get_vars() 28983 1726882973.36713: done getting variables 28983 1726882973.36767: in VariableManager get_vars() 28983 1726882973.36777: Calling all_inventory to load vars for managed_node2 28983 1726882973.36779: Calling groups_inventory to load vars for managed_node2 28983 1726882973.36782: Calling all_plugins_inventory to load vars for managed_node2 28983 
1726882973.36787: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.36790: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.36794: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.37046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.37365: done with get_vars() 28983 1726882973.37379: done queuing things up, now waiting for results queue to drain 28983 1726882973.37381: results queue empty 28983 1726882973.37382: checking for any_errors_fatal 28983 1726882973.37384: done checking for any_errors_fatal 28983 1726882973.37385: checking for max_fail_percentage 28983 1726882973.37386: done checking for max_fail_percentage 28983 1726882973.37387: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.37388: done checking to see if all hosts have failed 28983 1726882973.37389: getting the remaining hosts for this loop 28983 1726882973.37390: done getting the remaining hosts for this loop 28983 1726882973.37393: getting the next task for host managed_node2 28983 1726882973.37396: done getting next task for host managed_node2 28983 1726882973.37397: ^ task is: None 28983 1726882973.37399: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.37400: done queuing things up, now waiting for results queue to drain 28983 1726882973.37401: results queue empty 28983 1726882973.37402: checking for any_errors_fatal 28983 1726882973.37403: done checking for any_errors_fatal 28983 1726882973.37404: checking for max_fail_percentage 28983 1726882973.37405: done checking for max_fail_percentage 28983 1726882973.37406: checking to see if all hosts have failed and the running result is not ok 28983 1726882973.37407: done checking to see if all hosts have failed 28983 1726882973.37409: getting the next task for host managed_node2 28983 1726882973.37412: done getting next task for host managed_node2 28983 1726882973.37413: ^ task is: None 28983 1726882973.37415: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.37468: in VariableManager get_vars() 28983 1726882973.37485: done with get_vars() 28983 1726882973.37492: in VariableManager get_vars() 28983 1726882973.37503: done with get_vars() 28983 1726882973.37509: variable 'omit' from source: magic vars 28983 1726882973.37551: in VariableManager get_vars() 28983 1726882973.37563: done with get_vars() 28983 1726882973.37587: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 28983 1726882973.37961: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 28983 1726882973.37993: getting the remaining hosts for this loop 28983 1726882973.37995: done getting the remaining hosts for this loop 28983 1726882973.37998: getting the next task for host managed_node2 28983 1726882973.38001: done getting next task for host managed_node2 28983 1726882973.38003: ^ task is: TASK: Gathering Facts 28983 1726882973.38005: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882973.38007: getting variables 28983 1726882973.38009: in VariableManager get_vars() 28983 1726882973.38018: Calling all_inventory to load vars for managed_node2 28983 1726882973.38021: Calling groups_inventory to load vars for managed_node2 28983 1726882973.38024: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882973.38029: Calling all_plugins_play to load vars for managed_node2 28983 1726882973.38047: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882973.38052: Calling groups_plugins_play to load vars for managed_node2 28983 1726882973.38320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882973.38640: done with get_vars() 28983 1726882973.38651: done getting variables 28983 1726882973.38695: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Friday 20 September 2024 21:42:53 -0400 (0:00:00.062) 0:00:03.384 ****** 28983 1726882973.38720: entering _queue_task() for managed_node2/gather_facts 28983 1726882973.39197: worker is 1 (out of 1 available) 28983 1726882973.39210: exiting _queue_task() for managed_node2/gather_facts 28983 1726882973.39223: done queuing things up, now waiting for results queue to drain 28983 1726882973.39225: waiting for pending results... 
28983 1726882973.39805: running TaskExecutor() for managed_node2/TASK: Gathering Facts 28983 1726882973.39941: in run() - task 0affe814-3a2d-b16d-c0a7-000000000077 28983 1726882973.39946: variable 'ansible_search_path' from source: unknown 28983 1726882973.40164: calling self._execute() 28983 1726882973.40325: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.40330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.40333: variable 'omit' from source: magic vars 28983 1726882973.41217: variable 'ansible_distribution_major_version' from source: facts 28983 1726882973.41272: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882973.41291: variable 'omit' from source: magic vars 28983 1726882973.41328: variable 'omit' from source: magic vars 28983 1726882973.41383: variable 'omit' from source: magic vars 28983 1726882973.41444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882973.41499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882973.41541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882973.41640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882973.41643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882973.41646: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882973.41649: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.41651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.41781: Set connection var ansible_connection to ssh 28983 1726882973.41803: Set 
connection var ansible_shell_executable to /bin/sh 28983 1726882973.41821: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882973.41842: Set connection var ansible_timeout to 10 28983 1726882973.41856: Set connection var ansible_pipelining to False 28983 1726882973.41864: Set connection var ansible_shell_type to sh 28983 1726882973.41902: variable 'ansible_shell_executable' from source: unknown 28983 1726882973.41940: variable 'ansible_connection' from source: unknown 28983 1726882973.41943: variable 'ansible_module_compression' from source: unknown 28983 1726882973.41946: variable 'ansible_shell_type' from source: unknown 28983 1726882973.41949: variable 'ansible_shell_executable' from source: unknown 28983 1726882973.41951: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882973.41953: variable 'ansible_pipelining' from source: unknown 28983 1726882973.41956: variable 'ansible_timeout' from source: unknown 28983 1726882973.41962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882973.42207: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882973.42312: variable 'omit' from source: magic vars 28983 1726882973.42316: starting attempt loop 28983 1726882973.42319: running the handler 28983 1726882973.42322: variable 'ansible_facts' from source: unknown 28983 1726882973.42324: _low_level_execute_command(): starting 28983 1726882973.42326: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882973.43151: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882973.43214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882973.43240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882973.43584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882973.43627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882973.45427: stdout chunk (state=3): >>>/root <<< 28983 1726882973.45550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882973.45619: stderr chunk (state=3): >>><<< 28983 1726882973.45629: stdout chunk (state=3): >>><<< 28983 1726882973.45665: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882973.45769: _low_level_execute_command(): starting 28983 1726882973.45775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918 `" && echo ansible-tmp-1726882973.4567297-29131-266777482760918="` echo /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918 `" ) && sleep 0' 28983 1726882973.46315: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882973.46359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882973.46452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882973.46484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882973.46502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882973.46528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882973.46625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882973.48735: stdout chunk (state=3): >>>ansible-tmp-1726882973.4567297-29131-266777482760918=/root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918 <<< 28983 1726882973.48927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882973.48930: stdout chunk (state=3): >>><<< 28983 1726882973.48933: stderr chunk (state=3): >>><<< 28983 1726882973.49046: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882973.4567297-29131-266777482760918=/root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882973.49051: variable 'ansible_module_compression' from source: unknown 28983 1726882973.49053: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 28983 1726882973.49111: variable 'ansible_facts' from source: unknown 28983 1726882973.49302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py 28983 1726882973.49521: Sending initial data 28983 1726882973.49525: Sent initial data (154 bytes) 28983 1726882973.50153: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882973.50170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882973.50197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882973.50293: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882973.50314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882973.50423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882973.52189: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882973.52267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882973.52346: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp87cr6zlw /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py <<< 28983 1726882973.52350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py" <<< 28983 1726882973.52430: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp87cr6zlw" to remote "/root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py" <<< 28983 1726882973.56181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882973.56398: stdout chunk (state=3): >>><<< 28983 1726882973.56402: stderr chunk (state=3): >>><<< 28983 1726882973.56404: done transferring module to remote 28983 1726882973.56406: _low_level_execute_command(): starting 28983 1726882973.56408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/ /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py && sleep 0' 28983 1726882973.57124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882973.57225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882973.57270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882973.57295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882973.57345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882973.57408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882973.59650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882973.59654: stdout chunk (state=3): >>><<< 28983 1726882973.59656: stderr chunk (state=3): >>><<< 28983 1726882973.59674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882973.59692: _low_level_execute_command(): starting 28983 1726882973.60021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/AnsiballZ_setup.py && sleep 0' 28983 1726882973.61162: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882973.61172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882973.61187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882973.61205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882973.61438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882973.61465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882973.61482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882973.61495: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726882973.61598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882974.32158: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMAGmz4coceQATe4wVCPswNHKDq397sHN0wP3lxcDx5YXYj3mfO2mlh1Dpa7TCdRgLKwtozxXO6KafFlS3d0x9UWzSyKR0sSc77mhua/o3Y8EThq+wmVYqwwMQp1Vh8aBTvONV8N1UHqLp3aOdJIjHMGYdoUzUEF7xedcrV0fOV9AAAAFQDiC9S/VmOYdv/C8sXiFstIvsP/FQAAAIEApapvkLljxqN9GCi5UqXohiznCnndWFY9Vt/4wN+GtUjkuNJBqYHErEZCKfujpgVR94wM4sP3DbiJkL+OurGNHPJn7qrXDGQNIKExN7q3EzJI6yKBYdq1pnuhK1fBE/B8I/GQAEoqP3PMoutNlf85wWVgmt1DBc+D9D87BEGZzFoAAACBAIyk6Zb39dUz0T2fpmnSTF7AJHxsuBXwGZH1/5c5tWS0QGhwu5nzEoJUkQLhk+JqFJVRjNKoZ8wzH8N32ZrE15HfLF6/uIlfBorDH5AhDSnVumVmGZtYAerr8Cch5xqDXZSHTUhi7nBmdY/IKTgk7lCs0q4c7ja/wOueEHXkfWdF", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDjr2n9pN3Skd0p3fTYngf1LkuFARyHo/RDL4n6yhKcuFMhqe9MjqKkyMOvWaeNvTMkAiQ7TNEROz2mBqoPSot74UYaDR1nw9xQRq7skd9l/L6FlWrbg6EBCcQZUgkgjucBgmk3+INE0QVUdywVyW2IdmqayH/fLojViFulCLWlWn9cjFclC/t+sfMfoY6DrRIoi1GlfdlEfEHT9zGqC5syJrp6Yb3b9Ho/CYNAXya6aAHzMTLkx0/kU4czCptGZ9ew7HWLOtMv8iahxGrAp1VW4jj76+SZ5OisJ9N7+g8GPnaNAvsDNldGNQJWME6YNcEbxHblmHAEU0lq9EydM2W5iQHUnSezOSqQBljsiUACwwxZSphsqFYQnsHv4Vl/NlVTAJOApkU0VehWPUtOQNiqG+W/VGFMqqBksxF5tVDTO+qkvF5bm8JT2RSHAIbRpPPOYA8fg1PEPu1ONXD99Jn3urd0Y0kvUfp2NzPk1JFbxcGh+uDNHR2t5bVuOyRvmy0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKj0SRB0MBwzo3XKoUBfi6MCOa8n8Z6sjosikvEKYLTWy/hzFaSt2hhtv0qoPi/CAERuCNgGQ5pZPiqBpnr9C8A=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHVmW9cqR4t5U02ebXgqIiDjJ0aeuxmuwOiTXv538jBQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 
0.9208984375, "5m": 0.87060546875, "15m": 0.48486328125}, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-46-139", "ansible_nodename": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2c52174af731fc996c81a6a9338a65", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "42", "second": "53", "epoch": "1726882973", "epoch_int": "1726882973", "date": "2024-09-20", "time": "21:42:53", "iso8601_micro": "2024-09-21T01:42:53.926705Z", "iso8601": "2024-09-21T01:42:53Z", "iso8601_basic": "20240920T214253926705", "iso8601_basic_short": "20240920T214253", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "L<<< 28983 1726882974.32190: stdout chunk (state=3): >>>S_COLORS": "", "SSH_CONNECTION": "10.31.14.145 47942 10.31.46.139 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 47942 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2836, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 881, "free": 2836}, "nocache": {"free": 3457, "used": 260}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": 
"NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_uuid": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 937, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 
264124022784, "size_available": 251199434752, "block_size": 4096, "block_total": 64483404, "block_available": 61327987, "block_used": 3155417, "inode_total": 16384000, "inode_available": 16303512, "inode_used": 80488, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200<<< 28983 1726882974.32226: stdout chunk (state=3): >>>n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::2f3a:b84:7c06:1e06", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": 
"::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offlo<<< 28983 1726882974.32235: stdout chunk (state=3): >>>ad": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.46.139"], "ansible_all_ipv6_addresses": ["fe80::2f3a:b84:7c06:1e06"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.46.139", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::2f3a:b84:7c06:1e06"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 28983 1726882974.34242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882974.34299: stderr chunk (state=3): >>><<< 28983 1726882974.34303: stdout chunk (state=3): >>><<< 28983 1726882974.34335: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMAGmz4coceQATe4wVCPswNHKDq397sHN0wP3lxcDx5YXYj3mfO2mlh1Dpa7TCdRgLKwtozxXO6KafFlS3d0x9UWzSyKR0sSc77mhua/o3Y8EThq+wmVYqwwMQp1Vh8aBTvONV8N1UHqLp3aOdJIjHMGYdoUzUEF7xedcrV0fOV9AAAAFQDiC9S/VmOYdv/C8sXiFstIvsP/FQAAAIEApapvkLljxqN9GCi5UqXohiznCnndWFY9Vt/4wN+GtUjkuNJBqYHErEZCKfujpgVR94wM4sP3DbiJkL+OurGNHPJn7qrXDGQNIKExN7q3EzJI6yKBYdq1pnuhK1fBE/B8I/GQAEoqP3PMoutNlf85wWVgmt1DBc+D9D87BEGZzFoAAACBAIyk6Zb39dUz0T2fpmnSTF7AJHxsuBXwGZH1/5c5tWS0QGhwu5nzEoJUkQLhk+JqFJVRjNKoZ8wzH8N32ZrE15HfLF6/uIlfBorDH5AhDSnVumVmGZtYAerr8Cch5xqDXZSHTUhi7nBmdY/IKTgk7lCs0q4c7ja/wOueEHXkfWdF", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDjr2n9pN3Skd0p3fTYngf1LkuFARyHo/RDL4n6yhKcuFMhqe9MjqKkyMOvWaeNvTMkAiQ7TNEROz2mBqoPSot74UYaDR1nw9xQRq7skd9l/L6FlWrbg6EBCcQZUgkgjucBgmk3+INE0QVUdywVyW2IdmqayH/fLojViFulCLWlWn9cjFclC/t+sfMfoY6DrRIoi1GlfdlEfEHT9zGqC5syJrp6Yb3b9Ho/CYNAXya6aAHzMTLkx0/kU4czCptGZ9ew7HWLOtMv8iahxGrAp1VW4jj76+SZ5OisJ9N7+g8GPnaNAvsDNldGNQJWME6YNcEbxHblmHAEU0lq9EydM2W5iQHUnSezOSqQBljsiUACwwxZSphsqFYQnsHv4Vl/NlVTAJOApkU0VehWPUtOQNiqG+W/VGFMqqBksxF5tVDTO+qkvF5bm8JT2RSHAIbRpPPOYA8fg1PEPu1ONXD99Jn3urd0Y0kvUfp2NzPk1JFbxcGh+uDNHR2t5bVuOyRvmy0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKj0SRB0MBwzo3XKoUBfi6MCOa8n8Z6sjosikvEKYLTWy/hzFaSt2hhtv0qoPi/CAERuCNgGQ5pZPiqBpnr9C8A=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHVmW9cqR4t5U02ebXgqIiDjJ0aeuxmuwOiTXv538jBQ", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.9208984375, "5m": 0.87060546875, 
"15m": 0.48486328125}, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.9-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 9 02:28:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-46-139", "ansible_nodename": "ip-10-31-46-139.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2c52174af731fc996c81a6a9338a65", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "42", "second": "53", "epoch": "1726882973", "epoch_int": "1726882973", "date": "2024-09-20", "time": "21:42:53", "iso8601_micro": "2024-09-21T01:42:53.926705Z", "iso8601": "2024-09-21T01:42:53Z", "iso8601_basic": "20240920T214253926705", "iso8601_basic_short": "20240920T214253", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.145 47942 10.31.46.139 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.145 47942 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2836, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 881, "free": 2836}, "nocache": {"free": 3457, "used": 260}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": 
"NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_uuid": "ec2c5217-4af7-31fc-996c-81a6a9338a65", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["f92a5a40-e33d-4a6f-8746-997eff27cfbd"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 937, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251199434752, "block_size": 4096, "block_total": 64483404, 
"block_available": 61327987, "block_used": 3155417, "inode_total": 16384000, "inode_available": 16303512, "inode_used": 80488, "uuid": "f92a5a40-e33d-4a6f-8746-997eff27cfbd"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-100.fc39.x86_64", "root": "UUID=f92a5a40-e33d-4a6f-8746-997eff27cfbd", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::2f3a:b84:7c06:1e06", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.46.139", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:98:65:d3:42:6b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.46.139"], "ansible_all_ipv6_addresses": ["fe80::2f3a:b84:7c06:1e06"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.46.139", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::2f3a:b84:7c06:1e06"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882974.34654: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882974.34675: _low_level_execute_command(): starting 28983 1726882974.34679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882973.4567297-29131-266777482760918/ > /dev/null 2>&1 && sleep 0' 28983 1726882974.35115: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882974.35120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882974.35123: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882974.35126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882974.35177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882974.35184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882974.35255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882974.37160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882974.37206: stderr chunk (state=3): >>><<< 28983 1726882974.37209: stdout chunk (state=3): >>><<< 28983 1726882974.37222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882974.37231: handler run complete 28983 1726882974.37351: variable 'ansible_facts' from source: unknown 28983 1726882974.37445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.37728: variable 'ansible_facts' from source: unknown 28983 1726882974.37802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.37919: attempt loop complete, returning result 28983 1726882974.37924: _execute() done 28983 1726882974.37928: dumping result to json 28983 1726882974.37959: done dumping result, returning 28983 1726882974.37967: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affe814-3a2d-b16d-c0a7-000000000077] 28983 1726882974.37973: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000077 28983 1726882974.38276: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000077 28983 1726882974.38281: WORKER PROCESS EXITING ok: [managed_node2] 28983 1726882974.38586: no more pending results, returning what we have 28983 1726882974.38588: results queue empty 28983 1726882974.38589: checking for any_errors_fatal 28983 1726882974.38590: done checking for any_errors_fatal 28983 1726882974.38591: checking for max_fail_percentage 28983 1726882974.38592: done checking for max_fail_percentage 28983 1726882974.38593: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.38593: done checking to see if all hosts have failed 28983 1726882974.38594: getting the remaining hosts for this loop 28983 
1726882974.38595: done getting the remaining hosts for this loop 28983 1726882974.38598: getting the next task for host managed_node2 28983 1726882974.38602: done getting next task for host managed_node2 28983 1726882974.38603: ^ task is: TASK: meta (flush_handlers) 28983 1726882974.38605: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882974.38608: getting variables 28983 1726882974.38609: in VariableManager get_vars() 28983 1726882974.38627: Calling all_inventory to load vars for managed_node2 28983 1726882974.38629: Calling groups_inventory to load vars for managed_node2 28983 1726882974.38631: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.38641: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.38644: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.38647: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.38788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.38960: done with get_vars() 28983 1726882974.38968: done getting variables 28983 1726882974.39025: in VariableManager get_vars() 28983 1726882974.39033: Calling all_inventory to load vars for managed_node2 28983 1726882974.39037: Calling groups_inventory to load vars for managed_node2 28983 1726882974.39039: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.39043: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.39045: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.39047: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.39177: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.39347: done with get_vars() 28983 1726882974.39358: done queuing things up, now waiting for results queue to drain 28983 1726882974.39360: results queue empty 28983 1726882974.39360: checking for any_errors_fatal 28983 1726882974.39363: done checking for any_errors_fatal 28983 1726882974.39363: checking for max_fail_percentage 28983 1726882974.39364: done checking for max_fail_percentage 28983 1726882974.39365: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.39368: done checking to see if all hosts have failed 28983 1726882974.39369: getting the remaining hosts for this loop 28983 1726882974.39370: done getting the remaining hosts for this loop 28983 1726882974.39372: getting the next task for host managed_node2 28983 1726882974.39374: done getting next task for host managed_node2 28983 1726882974.39376: ^ task is: TASK: Show playbook name 28983 1726882974.39377: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882974.39380: getting variables 28983 1726882974.39380: in VariableManager get_vars() 28983 1726882974.39386: Calling all_inventory to load vars for managed_node2 28983 1726882974.39388: Calling groups_inventory to load vars for managed_node2 28983 1726882974.39389: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.39393: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.39395: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.39397: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.39518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.39685: done with get_vars() 28983 1726882974.39692: done getting variables 28983 1726882974.39756: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Show playbook name] ******************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11
Friday 20 September 2024 21:42:54 -0400 (0:00:01.010) 0:00:04.395 ******

28983 1726882974.39777: entering _queue_task() for managed_node2/debug 28983 1726882974.39779: Creating lock for debug 28983 1726882974.39987: worker is 1 (out of 1 available) 28983 1726882974.40002: exiting _queue_task() for managed_node2/debug 28983 1726882974.40012: done queuing things up, now waiting for results queue to drain 28983 1726882974.40014: waiting for pending results... 
28983 1726882974.40165: running TaskExecutor() for managed_node2/TASK: Show playbook name 28983 1726882974.40227: in run() - task 0affe814-3a2d-b16d-c0a7-00000000000b 28983 1726882974.40244: variable 'ansible_search_path' from source: unknown 28983 1726882974.40277: calling self._execute() 28983 1726882974.40341: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.40353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.40362: variable 'omit' from source: magic vars 28983 1726882974.40665: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.40681: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.40686: variable 'omit' from source: magic vars 28983 1726882974.40708: variable 'omit' from source: magic vars 28983 1726882974.40737: variable 'omit' from source: magic vars 28983 1726882974.40771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882974.40810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.40826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882974.40844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.40854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.40881: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.40891: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.40894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.40974: Set connection var ansible_connection to ssh 28983 1726882974.40987: Set 
connection var ansible_shell_executable to /bin/sh 28983 1726882974.40995: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.41005: Set connection var ansible_timeout to 10 28983 1726882974.41015: Set connection var ansible_pipelining to False 28983 1726882974.41018: Set connection var ansible_shell_type to sh 28983 1726882974.41037: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.41040: variable 'ansible_connection' from source: unknown 28983 1726882974.41043: variable 'ansible_module_compression' from source: unknown 28983 1726882974.41046: variable 'ansible_shell_type' from source: unknown 28983 1726882974.41051: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.41055: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.41060: variable 'ansible_pipelining' from source: unknown 28983 1726882974.41062: variable 'ansible_timeout' from source: unknown 28983 1726882974.41068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.41243: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.41253: variable 'omit' from source: magic vars 28983 1726882974.41260: starting attempt loop 28983 1726882974.41263: running the handler 28983 1726882974.41306: handler run complete 28983 1726882974.41326: attempt loop complete, returning result 28983 1726882974.41329: _execute() done 28983 1726882974.41333: dumping result to json 28983 1726882974.41338: done dumping result, returning 28983 1726882974.41351: done running TaskExecutor() for managed_node2/TASK: Show playbook name [0affe814-3a2d-b16d-c0a7-00000000000b] 28983 1726882974.41358: sending task result for task 
0affe814-3a2d-b16d-c0a7-00000000000b 28983 1726882974.41447: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000000b 28983 1726882974.41450: WORKER PROCESS EXITING
ok: [managed_node2] => {}

MSG:

this is: playbooks/tests_states.yml

28983 1726882974.41506: no more pending results, returning what we have 28983 1726882974.41509: results queue empty 28983 1726882974.41510: checking for any_errors_fatal 28983 1726882974.41513: done checking for any_errors_fatal 28983 1726882974.41513: checking for max_fail_percentage 28983 1726882974.41515: done checking for max_fail_percentage 28983 1726882974.41516: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.41517: done checking to see if all hosts have failed 28983 1726882974.41518: getting the remaining hosts for this loop 28983 1726882974.41519: done getting the remaining hosts for this loop 28983 1726882974.41523: getting the next task for host managed_node2 28983 1726882974.41528: done getting next task for host managed_node2 28983 1726882974.41532: ^ task is: TASK: Include the task 'run_test.yml' 28983 1726882974.41536: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882974.41539: getting variables 28983 1726882974.41540: in VariableManager get_vars() 28983 1726882974.41567: Calling all_inventory to load vars for managed_node2 28983 1726882974.41569: Calling groups_inventory to load vars for managed_node2 28983 1726882974.41573: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.41583: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.41587: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.41590: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.41769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.41943: done with get_vars() 28983 1726882974.41950: done getting variables

TASK [Include the task 'run_test.yml'] *****************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22
Friday 20 September 2024 21:42:54 -0400 (0:00:00.022) 0:00:04.417 ******

28983 1726882974.42016: entering _queue_task() for managed_node2/include_tasks 28983 1726882974.42207: worker is 1 (out of 1 available) 28983 1726882974.42219: exiting _queue_task() for managed_node2/include_tasks 28983 1726882974.42229: done queuing things up, now waiting for results queue to drain 28983 1726882974.42231: waiting for pending results... 
28983 1726882974.42370: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 28983 1726882974.42427: in run() - task 0affe814-3a2d-b16d-c0a7-00000000000d 28983 1726882974.42442: variable 'ansible_search_path' from source: unknown 28983 1726882974.42471: calling self._execute() 28983 1726882974.42533: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.42541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.42550: variable 'omit' from source: magic vars 28983 1726882974.42837: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.42848: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.42854: _execute() done 28983 1726882974.42857: dumping result to json 28983 1726882974.42862: done dumping result, returning 28983 1726882974.42869: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0affe814-3a2d-b16d-c0a7-00000000000d] 28983 1726882974.42874: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000000d 28983 1726882974.42988: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000000d 28983 1726882974.42991: WORKER PROCESS EXITING 28983 1726882974.43024: no more pending results, returning what we have 28983 1726882974.43028: in VariableManager get_vars() 28983 1726882974.43058: Calling all_inventory to load vars for managed_node2 28983 1726882974.43061: Calling groups_inventory to load vars for managed_node2 28983 1726882974.43064: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.43073: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.43076: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.43082: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.43230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28983 1726882974.43421: done with get_vars() 28983 1726882974.43427: variable 'ansible_search_path' from source: unknown 28983 1726882974.43440: we have included files to process 28983 1726882974.43441: generating all_blocks data 28983 1726882974.43443: done generating all_blocks data 28983 1726882974.43443: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726882974.43444: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726882974.43446: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726882974.43860: in VariableManager get_vars() 28983 1726882974.43875: done with get_vars() 28983 1726882974.44068: in VariableManager get_vars() 28983 1726882974.44083: done with get_vars() 28983 1726882974.44116: in VariableManager get_vars() 28983 1726882974.44127: done with get_vars() 28983 1726882974.44159: in VariableManager get_vars() 28983 1726882974.44169: done with get_vars() 28983 1726882974.44203: in VariableManager get_vars() 28983 1726882974.44215: done with get_vars() 28983 1726882974.44507: in VariableManager get_vars() 28983 1726882974.44520: done with get_vars() 28983 1726882974.44531: done processing included file 28983 1726882974.44532: iterating over new_blocks loaded from include file 28983 1726882974.44533: in VariableManager get_vars() 28983 1726882974.44543: done with get_vars() 28983 1726882974.44544: filtering new block on tags 28983 1726882974.44625: done filtering new block on tags 28983 1726882974.44628: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 28983 1726882974.44633: extending task lists for all hosts with included 
blocks 28983 1726882974.44662: done extending task lists 28983 1726882974.44663: done processing included files 28983 1726882974.44663: results queue empty 28983 1726882974.44664: checking for any_errors_fatal 28983 1726882974.44666: done checking for any_errors_fatal 28983 1726882974.44667: checking for max_fail_percentage 28983 1726882974.44668: done checking for max_fail_percentage 28983 1726882974.44668: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.44669: done checking to see if all hosts have failed 28983 1726882974.44669: getting the remaining hosts for this loop 28983 1726882974.44670: done getting the remaining hosts for this loop 28983 1726882974.44672: getting the next task for host managed_node2 28983 1726882974.44675: done getting next task for host managed_node2 28983 1726882974.44676: ^ task is: TASK: TEST: {{ lsr_description }} 28983 1726882974.44680: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882974.44682: getting variables 28983 1726882974.44682: in VariableManager get_vars() 28983 1726882974.44688: Calling all_inventory to load vars for managed_node2 28983 1726882974.44690: Calling groups_inventory to load vars for managed_node2 28983 1726882974.44692: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.44696: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.44698: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.44700: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.44838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.45012: done with get_vars() 28983 1726882974.45019: done getting variables 28983 1726882974.45050: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882974.45140: variable 'lsr_description' from source: include params

TASK [TEST: I can create a profile] ********************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024 21:42:54 -0400 (0:00:00.031) 0:00:04.449 ******

28983 1726882974.45169: entering _queue_task() for managed_node2/debug 28983 1726882974.45404: worker is 1 (out of 1 available) 28983 1726882974.45416: exiting _queue_task() for managed_node2/debug 28983 1726882974.45427: done queuing things up, now waiting for results queue to drain 28983 1726882974.45429: waiting for pending results... 
28983 1726882974.45857: running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile 28983 1726882974.45867: in run() - task 0affe814-3a2d-b16d-c0a7-000000000091 28983 1726882974.45870: variable 'ansible_search_path' from source: unknown 28983 1726882974.45873: variable 'ansible_search_path' from source: unknown 28983 1726882974.45908: calling self._execute() 28983 1726882974.46007: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.46023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.46046: variable 'omit' from source: magic vars 28983 1726882974.46525: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.46529: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.46540: variable 'omit' from source: magic vars 28983 1726882974.46595: variable 'omit' from source: magic vars 28983 1726882974.46742: variable 'lsr_description' from source: include params 28983 1726882974.46853: variable 'omit' from source: magic vars 28983 1726882974.46857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882974.46867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.46898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882974.46925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.46945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.46993: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.47003: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.47013: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.47142: Set connection var ansible_connection to ssh 28983 1726882974.47162: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.47187: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.47204: Set connection var ansible_timeout to 10 28983 1726882974.47215: Set connection var ansible_pipelining to False 28983 1726882974.47224: Set connection var ansible_shell_type to sh 28983 1726882974.47256: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.47269: variable 'ansible_connection' from source: unknown 28983 1726882974.47287: variable 'ansible_module_compression' from source: unknown 28983 1726882974.47397: variable 'ansible_shell_type' from source: unknown 28983 1726882974.47400: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.47403: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.47406: variable 'ansible_pipelining' from source: unknown 28983 1726882974.47408: variable 'ansible_timeout' from source: unknown 28983 1726882974.47410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.47516: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.47540: variable 'omit' from source: magic vars 28983 1726882974.47554: starting attempt loop 28983 1726882974.47564: running the handler 28983 1726882974.47628: handler run complete 28983 1726882974.47656: attempt loop complete, returning result 28983 1726882974.47665: _execute() done 28983 1726882974.47675: dumping result to json 28983 1726882974.47690: done dumping result, returning 28983 1726882974.47705: done 
running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile [0affe814-3a2d-b16d-c0a7-000000000091] 28983 1726882974.47720: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000091
ok: [managed_node2] => {}

MSG:

########## I can create a profile ##########

28983 1726882974.47890: no more pending results, returning what we have 28983 1726882974.47894: results queue empty 28983 1726882974.47895: checking for any_errors_fatal 28983 1726882974.47897: done checking for any_errors_fatal 28983 1726882974.47898: checking for max_fail_percentage 28983 1726882974.47900: done checking for max_fail_percentage 28983 1726882974.47901: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.47902: done checking to see if all hosts have failed 28983 1726882974.47903: getting the remaining hosts for this loop 28983 1726882974.47906: done getting the remaining hosts for this loop 28983 1726882974.47910: getting the next task for host managed_node2 28983 1726882974.47917: done getting next task for host managed_node2 28983 1726882974.47921: ^ task is: TASK: Show item 28983 1726882974.47926: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882974.47931: getting variables 28983 1726882974.47932: in VariableManager get_vars() 28983 1726882974.47965: Calling all_inventory to load vars for managed_node2 28983 1726882974.47969: Calling groups_inventory to load vars for managed_node2 28983 1726882974.47973: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.47987: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.47992: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.47996: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.48440: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000091 28983 1726882974.48444: WORKER PROCESS EXITING 28983 1726882974.48469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.48815: done with get_vars() 28983 1726882974.48827: done getting variables 28983 1726882974.48893: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024 21:42:54 -0400 (0:00:00.037) 0:00:04.487 ******

28983 1726882974.48924: entering _queue_task() for managed_node2/debug 28983 1726882974.49155: worker is 1 (out of 1 available) 28983 1726882974.49167: exiting _queue_task() for managed_node2/debug 28983 1726882974.49184: done queuing things up, now waiting for results queue to drain 28983 1726882974.49186: waiting for pending results... 
28983 1726882974.49447: running TaskExecutor() for managed_node2/TASK: Show item 28983 1726882974.49560: in run() - task 0affe814-3a2d-b16d-c0a7-000000000092 28983 1726882974.49586: variable 'ansible_search_path' from source: unknown 28983 1726882974.49598: variable 'ansible_search_path' from source: unknown 28983 1726882974.49741: variable 'omit' from source: magic vars 28983 1726882974.49830: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.49852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.49882: variable 'omit' from source: magic vars 28983 1726882974.50239: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.50251: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.50254: variable 'omit' from source: magic vars 28983 1726882974.50289: variable 'omit' from source: magic vars 28983 1726882974.50327: variable 'item' from source: unknown 28983 1726882974.50392: variable 'item' from source: unknown 28983 1726882974.50406: variable 'omit' from source: magic vars 28983 1726882974.50443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882974.50474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.50494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882974.50509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.50520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.50546: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.50550: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726882974.50555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.50634: Set connection var ansible_connection to ssh 28983 1726882974.50645: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.50654: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.50662: Set connection var ansible_timeout to 10 28983 1726882974.50669: Set connection var ansible_pipelining to False 28983 1726882974.50671: Set connection var ansible_shell_type to sh 28983 1726882974.50693: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.50697: variable 'ansible_connection' from source: unknown 28983 1726882974.50699: variable 'ansible_module_compression' from source: unknown 28983 1726882974.50702: variable 'ansible_shell_type' from source: unknown 28983 1726882974.50708: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.50711: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.50716: variable 'ansible_pipelining' from source: unknown 28983 1726882974.50719: variable 'ansible_timeout' from source: unknown 28983 1726882974.50724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.50864: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.50872: variable 'omit' from source: magic vars 28983 1726882974.50881: starting attempt loop 28983 1726882974.50884: running the handler 28983 1726882974.50922: variable 'lsr_description' from source: include params 28983 1726882974.50981: variable 'lsr_description' from source: include params 28983 1726882974.50988: handler run complete 28983 1726882974.51004: attempt loop 
complete, returning result 28983 1726882974.51019: variable 'item' from source: unknown 28983 1726882974.51071: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can create a profile"
}
28983 1726882974.51216: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.51219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.51222: variable 'omit' from source: magic vars 28983 1726882974.51329: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.51340: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.51345: variable 'omit' from source: magic vars 28983 1726882974.51356: variable 'omit' from source: magic vars 28983 1726882974.51389: variable 'item' from source: unknown 28983 1726882974.51443: variable 'item' from source: unknown 28983 1726882974.51458: variable 'omit' from source: magic vars 28983 1726882974.51474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.51483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.51489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.51500: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.51503: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.51508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.51568: Set connection var ansible_connection to ssh 28983 1726882974.51577: Set connection var ansible_shell_executable to /bin/sh 28983 
1726882974.51587: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.51596: Set connection var ansible_timeout to 10 28983 1726882974.51602: Set connection var ansible_pipelining to False 28983 1726882974.51604: Set connection var ansible_shell_type to sh 28983 1726882974.51621: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.51624: variable 'ansible_connection' from source: unknown 28983 1726882974.51627: variable 'ansible_module_compression' from source: unknown 28983 1726882974.51631: variable 'ansible_shell_type' from source: unknown 28983 1726882974.51637: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.51641: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.51646: variable 'ansible_pipelining' from source: unknown 28983 1726882974.51649: variable 'ansible_timeout' from source: unknown 28983 1726882974.51659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.51727: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.51738: variable 'omit' from source: magic vars 28983 1726882974.51744: starting attempt loop 28983 1726882974.51746: running the handler 28983 1726882974.51768: variable 'lsr_setup' from source: include params 28983 1726882974.51822: variable 'lsr_setup' from source: include params 28983 1726882974.51857: handler run complete 28983 1726882974.51871: attempt loop complete, returning result 28983 1726882974.51886: variable 'item' from source: unknown 28983 1726882974.51943: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 28983 1726882974.52043: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.52046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.52056: variable 'omit' from source: magic vars 28983 1726882974.52180: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.52188: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.52192: variable 'omit' from source: magic vars 28983 1726882974.52205: variable 'omit' from source: magic vars 28983 1726882974.52238: variable 'item' from source: unknown 28983 1726882974.52294: variable 'item' from source: unknown 28983 1726882974.52306: variable 'omit' from source: magic vars 28983 1726882974.52321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.52328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.52337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.52347: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.52350: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.52357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.52539: Set connection var ansible_connection to ssh 28983 1726882974.52542: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.52545: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.52547: Set connection var ansible_timeout to 10 28983 1726882974.52549: Set connection var ansible_pipelining to False 28983 1726882974.52551: 
Set connection var ansible_shell_type to sh 28983 1726882974.52554: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.52556: variable 'ansible_connection' from source: unknown 28983 1726882974.52558: variable 'ansible_module_compression' from source: unknown 28983 1726882974.52560: variable 'ansible_shell_type' from source: unknown 28983 1726882974.52562: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.52564: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.52565: variable 'ansible_pipelining' from source: unknown 28983 1726882974.52567: variable 'ansible_timeout' from source: unknown 28983 1726882974.52569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.52699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.52713: variable 'omit' from source: magic vars 28983 1726882974.52723: starting attempt loop 28983 1726882974.52731: running the handler 28983 1726882974.52759: variable 'lsr_test' from source: include params 28983 1726882974.52844: variable 'lsr_test' from source: include params 28983 1726882974.52870: handler run complete 28983 1726882974.52900: attempt loop complete, returning result 28983 1726882974.52925: variable 'item' from source: unknown 28983 1726882974.53005: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile.yml" ] } 28983 1726882974.53244: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.53248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.53250: 
variable 'omit' from source: magic vars 28983 1726882974.53427: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.53441: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.53452: variable 'omit' from source: magic vars 28983 1726882974.53495: variable 'omit' from source: magic vars 28983 1726882974.53549: variable 'item' from source: unknown 28983 1726882974.53608: variable 'item' from source: unknown 28983 1726882974.53653: variable 'omit' from source: magic vars 28983 1726882974.53656: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.53659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.53666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.53677: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.53682: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.53684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.53744: Set connection var ansible_connection to ssh 28983 1726882974.53754: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.53762: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.53770: Set connection var ansible_timeout to 10 28983 1726882974.53776: Set connection var ansible_pipelining to False 28983 1726882974.53782: Set connection var ansible_shell_type to sh 28983 1726882974.53797: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.53800: variable 'ansible_connection' from source: unknown 28983 1726882974.53802: variable 'ansible_module_compression' from source: unknown 
28983 1726882974.53805: variable 'ansible_shell_type' from source: unknown 28983 1726882974.53810: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.53813: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.53819: variable 'ansible_pipelining' from source: unknown 28983 1726882974.53822: variable 'ansible_timeout' from source: unknown 28983 1726882974.53827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.53908: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.53916: variable 'omit' from source: magic vars 28983 1726882974.53919: starting attempt loop 28983 1726882974.53923: running the handler 28983 1726882974.53942: variable 'lsr_assert' from source: include params 28983 1726882974.53996: variable 'lsr_assert' from source: include params 28983 1726882974.54010: handler run complete 28983 1726882974.54022: attempt loop complete, returning result 28983 1726882974.54037: variable 'item' from source: unknown 28983 1726882974.54093: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_present.yml" ] } 28983 1726882974.54175: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.54190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.54196: variable 'omit' from source: magic vars 28983 1726882974.54343: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.54347: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.54353: variable 'omit' from source: 
magic vars 28983 1726882974.54366: variable 'omit' from source: magic vars 28983 1726882974.54400: variable 'item' from source: unknown 28983 1726882974.54452: variable 'item' from source: unknown 28983 1726882974.54465: variable 'omit' from source: magic vars 28983 1726882974.54482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.54488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.54495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.54505: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.54510: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.54513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.54570: Set connection var ansible_connection to ssh 28983 1726882974.54581: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.54588: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.54596: Set connection var ansible_timeout to 10 28983 1726882974.54603: Set connection var ansible_pipelining to False 28983 1726882974.54605: Set connection var ansible_shell_type to sh 28983 1726882974.54621: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.54626: variable 'ansible_connection' from source: unknown 28983 1726882974.54628: variable 'ansible_module_compression' from source: unknown 28983 1726882974.54630: variable 'ansible_shell_type' from source: unknown 28983 1726882974.54643: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.54645: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.54648: variable 
'ansible_pipelining' from source: unknown 28983 1726882974.54650: variable 'ansible_timeout' from source: unknown 28983 1726882974.54654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.54721: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.54728: variable 'omit' from source: magic vars 28983 1726882974.54735: starting attempt loop 28983 1726882974.54739: running the handler 28983 1726882974.54758: variable 'lsr_assert_when' from source: include params 28983 1726882974.54809: variable 'lsr_assert_when' from source: include params 28983 1726882974.54885: variable 'network_provider' from source: set_fact 28983 1726882974.54911: handler run complete 28983 1726882974.54925: attempt loop complete, returning result 28983 1726882974.54940: variable 'item' from source: unknown 28983 1726882974.54994: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_present.yml" } ] } 28983 1726882974.55077: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.55091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.55099: variable 'omit' from source: magic vars 28983 1726882974.55220: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.55224: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.55230: variable 'omit' from source: magic vars 28983 1726882974.55245: variable 'omit' from source: magic vars 28983 1726882974.55275: variable 'item' from source: unknown 28983 
1726882974.55329: variable 'item' from source: unknown 28983 1726882974.55344: variable 'omit' from source: magic vars 28983 1726882974.55359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.55366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.55373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.55384: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.55387: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.55392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.55449: Set connection var ansible_connection to ssh 28983 1726882974.55458: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.55466: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.55475: Set connection var ansible_timeout to 10 28983 1726882974.55483: Set connection var ansible_pipelining to False 28983 1726882974.55485: Set connection var ansible_shell_type to sh 28983 1726882974.55500: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.55503: variable 'ansible_connection' from source: unknown 28983 1726882974.55507: variable 'ansible_module_compression' from source: unknown 28983 1726882974.55511: variable 'ansible_shell_type' from source: unknown 28983 1726882974.55515: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.55519: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.55530: variable 'ansible_pipelining' from source: unknown 28983 1726882974.55533: variable 'ansible_timeout' from source: unknown 28983 1726882974.55537: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.55601: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.55609: variable 'omit' from source: magic vars 28983 1726882974.55613: starting attempt loop 28983 1726882974.55616: running the handler 28983 1726882974.55637: variable 'lsr_fail_debug' from source: play vars 28983 1726882974.55839: variable 'lsr_fail_debug' from source: play vars 28983 1726882974.55843: handler run complete 28983 1726882974.55845: attempt loop complete, returning result 28983 1726882974.55848: variable 'item' from source: unknown 28983 1726882974.55859: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 28983 1726882974.56012: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.56027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.56046: variable 'omit' from source: magic vars 28983 1726882974.56236: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.56249: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.56259: variable 'omit' from source: magic vars 28983 1726882974.56339: variable 'omit' from source: magic vars 28983 1726882974.56342: variable 'item' from source: unknown 28983 1726882974.56393: variable 'item' from source: unknown 28983 1726882974.56412: variable 'omit' from source: magic vars 28983 1726882974.56435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 28983 1726882974.56452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.56465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.56483: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.56493: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.56502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.56587: Set connection var ansible_connection to ssh 28983 1726882974.56603: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.56618: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.56740: Set connection var ansible_timeout to 10 28983 1726882974.56743: Set connection var ansible_pipelining to False 28983 1726882974.56746: Set connection var ansible_shell_type to sh 28983 1726882974.56748: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.56750: variable 'ansible_connection' from source: unknown 28983 1726882974.56752: variable 'ansible_module_compression' from source: unknown 28983 1726882974.56754: variable 'ansible_shell_type' from source: unknown 28983 1726882974.56756: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.56758: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.56760: variable 'ansible_pipelining' from source: unknown 28983 1726882974.56762: variable 'ansible_timeout' from source: unknown 28983 1726882974.56764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.56812: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.56825: variable 'omit' from source: magic vars 28983 1726882974.56836: starting attempt loop 28983 1726882974.56843: running the handler 28983 1726882974.56867: variable 'lsr_cleanup' from source: include params 28983 1726882974.56941: variable 'lsr_cleanup' from source: include params 28983 1726882974.56962: handler run complete 28983 1726882974.56983: attempt loop complete, returning result 28983 1726882974.57005: variable 'item' from source: unknown 28983 1726882974.57082: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 28983 1726882974.57240: dumping result to json 28983 1726882974.57243: done dumping result, returning 28983 1726882974.57246: done running TaskExecutor() for managed_node2/TASK: Show item [0affe814-3a2d-b16d-c0a7-000000000092] 28983 1726882974.57248: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000092 28983 1726882974.57385: no more pending results, returning what we have 28983 1726882974.57389: results queue empty 28983 1726882974.57390: checking for any_errors_fatal 28983 1726882974.57397: done checking for any_errors_fatal 28983 1726882974.57398: checking for max_fail_percentage 28983 1726882974.57400: done checking for max_fail_percentage 28983 1726882974.57401: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.57402: done checking to see if all hosts have failed 28983 1726882974.57402: getting the remaining hosts for this loop 28983 1726882974.57404: done getting the remaining hosts for this loop 28983 1726882974.57409: getting the next task for host managed_node2 28983 1726882974.57416: done getting next task for host managed_node2 28983 
1726882974.57419: ^ task is: TASK: Include the task 'show_interfaces.yml' 28983 1726882974.57423: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882974.57426: getting variables 28983 1726882974.57427: in VariableManager get_vars() 28983 1726882974.57458: Calling all_inventory to load vars for managed_node2 28983 1726882974.57461: Calling groups_inventory to load vars for managed_node2 28983 1726882974.57465: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.57476: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.57481: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.57486: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.57765: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000092 28983 1726882974.57768: WORKER PROCESS EXITING 28983 1726882974.57795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.58115: done with get_vars() 28983 1726882974.58127: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:42:54 -0400 (0:00:00.092) 0:00:04.580 ****** 28983 1726882974.58224: entering _queue_task() for managed_node2/include_tasks 28983 
1726882974.58463: worker is 1 (out of 1 available) 28983 1726882974.58476: exiting _queue_task() for managed_node2/include_tasks 28983 1726882974.58489: done queuing things up, now waiting for results queue to drain 28983 1726882974.58491: waiting for pending results... 28983 1726882974.58738: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28983 1726882974.58850: in run() - task 0affe814-3a2d-b16d-c0a7-000000000093 28983 1726882974.58873: variable 'ansible_search_path' from source: unknown 28983 1726882974.58882: variable 'ansible_search_path' from source: unknown 28983 1726882974.58927: calling self._execute() 28983 1726882974.59024: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.59043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.59062: variable 'omit' from source: magic vars 28983 1726882974.59471: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.59490: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.59503: _execute() done 28983 1726882974.59515: dumping result to json 28983 1726882974.59524: done dumping result, returning 28983 1726882974.59538: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-b16d-c0a7-000000000093] 28983 1726882974.59549: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000093 28983 1726882974.59775: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000093 28983 1726882974.59779: WORKER PROCESS EXITING 28983 1726882974.59803: no more pending results, returning what we have 28983 1726882974.59807: in VariableManager get_vars() 28983 1726882974.59841: Calling all_inventory to load vars for managed_node2 28983 1726882974.59844: Calling groups_inventory to load vars for managed_node2 28983 1726882974.59849: Calling all_plugins_inventory to load vars for managed_node2 
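The per-item `ok: [managed_node2] => (item=lsr_*)` results dumped above are the signature of a `debug` task looped over variable names: with `var: "{{ item }}"`, each result carries `ansible_loop_var`, the `item` string, and a key named after the variable itself, exactly as seen in the log. A hypothetical reconstruction of that "Show item" task (the task name and the `lsr_*` variable names are taken from the log; the exact file layout is an assumption):

```yaml
# Hypothetical sketch of the "Show item" task traced above.
# Each loop item is a variable *name*; debug resolves it per item,
# which yields the "item" + "<varname>" keys seen in the ok: results.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup
```

The repeated `Evaluated conditional (ansible_distribution_major_version != '6'): True` records before each item correspond to a `when:` condition on the task, re-evaluated once per loop iteration.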
28983 1726882974.59860: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.59863: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.59867: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.60176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.60495: done with get_vars() 28983 1726882974.60503: variable 'ansible_search_path' from source: unknown 28983 1726882974.60505: variable 'ansible_search_path' from source: unknown 28983 1726882974.60553: we have included files to process 28983 1726882974.60554: generating all_blocks data 28983 1726882974.60556: done generating all_blocks data 28983 1726882974.60561: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726882974.60562: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726882974.60565: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726882974.60739: in VariableManager get_vars() 28983 1726882974.60760: done with get_vars() 28983 1726882974.60887: done processing included file 28983 1726882974.60889: iterating over new_blocks loaded from include file 28983 1726882974.60891: in VariableManager get_vars() 28983 1726882974.60907: done with get_vars() 28983 1726882974.60909: filtering new block on tags 28983 1726882974.60951: done filtering new block on tags 28983 1726882974.60954: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28983 1726882974.60959: extending task lists for all hosts with included blocks 28983 1726882974.61615: 
done extending task lists 28983 1726882974.61617: done processing included files 28983 1726882974.61618: results queue empty 28983 1726882974.61619: checking for any_errors_fatal 28983 1726882974.61625: done checking for any_errors_fatal 28983 1726882974.61626: checking for max_fail_percentage 28983 1726882974.61627: done checking for max_fail_percentage 28983 1726882974.61628: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.61629: done checking to see if all hosts have failed 28983 1726882974.61630: getting the remaining hosts for this loop 28983 1726882974.61632: done getting the remaining hosts for this loop 28983 1726882974.61636: getting the next task for host managed_node2 28983 1726882974.61641: done getting next task for host managed_node2 28983 1726882974.61643: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28983 1726882974.61647: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882974.61649: getting variables 28983 1726882974.61650: in VariableManager get_vars() 28983 1726882974.61660: Calling all_inventory to load vars for managed_node2 28983 1726882974.61662: Calling groups_inventory to load vars for managed_node2 28983 1726882974.61665: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.61671: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.61674: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.61678: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.61892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.62208: done with get_vars() 28983 1726882974.62218: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:42:54 -0400 (0:00:00.040) 0:00:04.620 ****** 28983 1726882974.62301: entering _queue_task() for managed_node2/include_tasks 28983 1726882974.62538: worker is 1 (out of 1 available) 28983 1726882974.62551: exiting _queue_task() for managed_node2/include_tasks 28983 1726882974.62563: done queuing things up, now waiting for results queue to drain 28983 1726882974.62564: waiting for pending results... 
28983 1726882974.62954: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28983 1726882974.62958: in run() - task 0affe814-3a2d-b16d-c0a7-0000000000ba 28983 1726882974.62962: variable 'ansible_search_path' from source: unknown 28983 1726882974.62965: variable 'ansible_search_path' from source: unknown 28983 1726882974.63003: calling self._execute() 28983 1726882974.63094: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.63106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.63122: variable 'omit' from source: magic vars 28983 1726882974.63524: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.63547: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.63559: _execute() done 28983 1726882974.63568: dumping result to json 28983 1726882974.63576: done dumping result, returning 28983 1726882974.63588: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-b16d-c0a7-0000000000ba] 28983 1726882974.63605: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000ba 28983 1726882974.63857: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000ba 28983 1726882974.63861: WORKER PROCESS EXITING 28983 1726882974.63886: no more pending results, returning what we have 28983 1726882974.63891: in VariableManager get_vars() 28983 1726882974.63922: Calling all_inventory to load vars for managed_node2 28983 1726882974.63925: Calling groups_inventory to load vars for managed_node2 28983 1726882974.63929: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.63942: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.63946: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.63950: Calling groups_plugins_play to load vars for managed_node2 28983 
1726882974.64282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.64587: done with get_vars() 28983 1726882974.64596: variable 'ansible_search_path' from source: unknown 28983 1726882974.64598: variable 'ansible_search_path' from source: unknown 28983 1726882974.64640: we have included files to process 28983 1726882974.64642: generating all_blocks data 28983 1726882974.64644: done generating all_blocks data 28983 1726882974.64646: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726882974.64647: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726882974.64650: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726882974.65019: done processing included file 28983 1726882974.65021: iterating over new_blocks loaded from include file 28983 1726882974.65023: in VariableManager get_vars() 28983 1726882974.65043: done with get_vars() 28983 1726882974.65045: filtering new block on tags 28983 1726882974.65090: done filtering new block on tags 28983 1726882974.65093: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28983 1726882974.65099: extending task lists for all hosts with included blocks 28983 1726882974.65318: done extending task lists 28983 1726882974.65319: done processing included files 28983 1726882974.65320: results queue empty 28983 1726882974.65321: checking for any_errors_fatal 28983 1726882974.65324: done checking for any_errors_fatal 28983 1726882974.65325: checking for max_fail_percentage 28983 1726882974.65327: done 
checking for max_fail_percentage 28983 1726882974.65328: checking to see if all hosts have failed and the running result is not ok 28983 1726882974.65329: done checking to see if all hosts have failed 28983 1726882974.65330: getting the remaining hosts for this loop 28983 1726882974.65331: done getting the remaining hosts for this loop 28983 1726882974.65336: getting the next task for host managed_node2 28983 1726882974.65342: done getting next task for host managed_node2 28983 1726882974.65344: ^ task is: TASK: Gather current interface info 28983 1726882974.65348: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882974.65351: getting variables 28983 1726882974.65352: in VariableManager get_vars() 28983 1726882974.65362: Calling all_inventory to load vars for managed_node2 28983 1726882974.65364: Calling groups_inventory to load vars for managed_node2 28983 1726882974.65367: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882974.65373: Calling all_plugins_play to load vars for managed_node2 28983 1726882974.65376: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882974.65380: Calling groups_plugins_play to load vars for managed_node2 28983 1726882974.65590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882974.65923: done with get_vars() 28983 1726882974.65935: done getting variables 28983 1726882974.65978: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:42:54 -0400 (0:00:00.037) 0:00:04.657 ****** 28983 1726882974.66007: entering _queue_task() for managed_node2/command 28983 1726882974.66250: worker is 1 (out of 1 available) 28983 1726882974.66262: exiting _queue_task() for managed_node2/command 28983 1726882974.66274: done queuing things up, now waiting for results queue to drain 28983 1726882974.66276: waiting for pending results... 
28983 1726882974.66656: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28983 1726882974.66661: in run() - task 0affe814-3a2d-b16d-c0a7-0000000000f5 28983 1726882974.66672: variable 'ansible_search_path' from source: unknown 28983 1726882974.66680: variable 'ansible_search_path' from source: unknown 28983 1726882974.66720: calling self._execute() 28983 1726882974.66810: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.66823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.66843: variable 'omit' from source: magic vars 28983 1726882974.67260: variable 'ansible_distribution_major_version' from source: facts 28983 1726882974.67278: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882974.67291: variable 'omit' from source: magic vars 28983 1726882974.67362: variable 'omit' from source: magic vars 28983 1726882974.67412: variable 'omit' from source: magic vars 28983 1726882974.67463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882974.67515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882974.67546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882974.67624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.67627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882974.67630: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882974.67640: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.67649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726882974.67773: Set connection var ansible_connection to ssh 28983 1726882974.67793: Set connection var ansible_shell_executable to /bin/sh 28983 1726882974.67809: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882974.67823: Set connection var ansible_timeout to 10 28983 1726882974.67837: Set connection var ansible_pipelining to False 28983 1726882974.67848: Set connection var ansible_shell_type to sh 28983 1726882974.67940: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.67944: variable 'ansible_connection' from source: unknown 28983 1726882974.67953: variable 'ansible_module_compression' from source: unknown 28983 1726882974.67955: variable 'ansible_shell_type' from source: unknown 28983 1726882974.67958: variable 'ansible_shell_executable' from source: unknown 28983 1726882974.67960: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882974.67962: variable 'ansible_pipelining' from source: unknown 28983 1726882974.67964: variable 'ansible_timeout' from source: unknown 28983 1726882974.67966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882974.68097: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882974.68115: variable 'omit' from source: magic vars 28983 1726882974.68126: starting attempt loop 28983 1726882974.68135: running the handler 28983 1726882974.68157: _low_level_execute_command(): starting 28983 1726882974.68175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882974.69037: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882974.69066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882974.69182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882974.70955: stdout chunk (state=3): >>>/root <<< 28983 1726882974.71144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882974.71147: stdout chunk (state=3): >>><<< 28983 1726882974.71150: stderr chunk (state=3): >>><<< 28983 1726882974.71171: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882974.71270: _low_level_execute_command(): starting 28983 1726882974.71276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025 `" && echo ansible-tmp-1726882974.7117753-29189-217163847428025="` echo /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025 `" ) && sleep 0' 28983 1726882974.72019: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882974.72022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882974.72025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882974.72027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726882974.72030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882974.72122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882974.72194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882974.74193: stdout chunk (state=3): >>>ansible-tmp-1726882974.7117753-29189-217163847428025=/root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025 <<< 28983 1726882974.74389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882974.74407: stdout chunk (state=3): >>><<< 28983 1726882974.74422: stderr chunk (state=3): >>><<< 28983 1726882974.74447: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882974.7117753-29189-217163847428025=/root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882974.74640: variable 'ansible_module_compression' from source: unknown 28983 1726882974.74643: ANSIBALLZ: Using generic lock for ansible.legacy.command 28983 1726882974.74646: ANSIBALLZ: Acquiring lock 28983 1726882974.74648: ANSIBALLZ: Lock acquired: 140284034522080 28983 1726882974.74650: ANSIBALLZ: Creating module 28983 1726882974.92825: ANSIBALLZ: Writing module into payload 28983 1726882974.92957: ANSIBALLZ: Writing module 28983 1726882974.92995: ANSIBALLZ: Renaming module 28983 1726882974.93007: ANSIBALLZ: Done creating module 28983 1726882974.93029: variable 'ansible_facts' from source: unknown 28983 1726882974.93122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py 28983 1726882974.93325: Sending initial data 28983 1726882974.93329: Sent initial data (156 bytes) 28983 1726882974.93995: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882974.94052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882974.94137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882974.94157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882974.94198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882974.94311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882974.96048: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726882974.96064: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882974.96167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882974.96249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmprqzev74d /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py <<< 28983 1726882974.96253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py" <<< 28983 1726882974.96317: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmprqzev74d" to remote "/root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py" <<< 28983 1726882974.97913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882974.97917: stderr chunk (state=3): >>><<< 28983 1726882974.98044: stdout chunk (state=3): >>><<< 28983 1726882974.98048: done transferring module to remote 28983 1726882974.98050: _low_level_execute_command(): starting 28983 1726882974.98053: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/ /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py && sleep 0' 28983 1726882974.98626: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882974.98644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882974.98661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882974.98680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882974.98698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 
28983 1726882974.98804: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882974.98832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882974.98927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.00851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.00939: stderr chunk (state=3): >>><<< 28983 1726882975.00952: stdout chunk (state=3): >>><<< 28983 1726882975.00979: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.00988: _low_level_execute_command(): starting 28983 1726882975.00999: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/AnsiballZ_command.py && sleep 0' 28983 1726882975.01620: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882975.01638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882975.01655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.01674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882975.01703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882975.01716: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882975.01750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.01805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.01857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.01883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.01921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.02000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.19643: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:42:55.191608", "end": "2024-09-20 21:42:55.195154", "delta": "0:00:00.003546", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726882975.21873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882975.21877: stdout chunk (state=3): >>><<< 28983 1726882975.21880: stderr chunk (state=3): >>><<< 28983 1726882975.22042: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:42:55.191608", "end": "2024-09-20 21:42:55.195154", "delta": "0:00:00.003546", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
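(Editor's note, not part of the log: the module result above arrives as a single JSON payload on stdout, which the controller parses before the follow-up "Set current_interfaces" task splits `stdout` into interface names. A minimal sketch of that recovery step — the `interfaces_from_result` helper is hypothetical, not Ansible's actual result parser:)

```python
import json

def interfaces_from_result(raw: str) -> list[str]:
    """Parse a command-module result payload and split stdout into lines."""
    result = json.loads(raw)
    if result.get("rc", 1) != 0:
        # A non-zero return code means the listing failed; report nothing.
        return []
    return result["stdout"].splitlines()

# The payload observed in this log run:
raw = json.dumps({
    "changed": True,
    "stdout": "bonding_masters\neth0\nlo",
    "rc": 0,
    "cmd": ["ls", "-1"],
})
print(interfaces_from_result(raw))  # → ['bonding_masters', 'eth0', 'lo']
```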
28983 1726882975.22047: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882975.22049: _low_level_execute_command(): starting 28983 1726882975.22052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882974.7117753-29189-217163847428025/ > /dev/null 2>&1 && sleep 0' 28983 1726882975.22666: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882975.22680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882975.22697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.22725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882975.22840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726882975.22869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.22884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.22989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.24948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.25006: stderr chunk (state=3): >>><<< 28983 1726882975.25026: stdout chunk (state=3): >>><<< 28983 1726882975.25052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 28983 1726882975.25065: handler run complete 28983 1726882975.25100: Evaluated conditional (False): False 28983 1726882975.25120: attempt loop complete, returning result 28983 1726882975.25139: _execute() done 28983 1726882975.25142: dumping result to json 28983 1726882975.25234: done dumping result, returning 28983 1726882975.25238: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affe814-3a2d-b16d-c0a7-0000000000f5] 28983 1726882975.25242: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000f5 28983 1726882975.25321: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000f5 28983 1726882975.25324: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003546", "end": "2024-09-20 21:42:55.195154", "rc": 0, "start": "2024-09-20 21:42:55.191608" } STDOUT: bonding_masters eth0 lo 28983 1726882975.25425: no more pending results, returning what we have 28983 1726882975.25429: results queue empty 28983 1726882975.25430: checking for any_errors_fatal 28983 1726882975.25432: done checking for any_errors_fatal 28983 1726882975.25435: checking for max_fail_percentage 28983 1726882975.25437: done checking for max_fail_percentage 28983 1726882975.25438: checking to see if all hosts have failed and the running result is not ok 28983 1726882975.25439: done checking to see if all hosts have failed 28983 1726882975.25440: getting the remaining hosts for this loop 28983 1726882975.25442: done getting the remaining hosts for this loop 28983 1726882975.25447: getting the next task for host managed_node2 28983 1726882975.25456: done getting next task for host managed_node2 28983 1726882975.25459: ^ task is: TASK: Set current_interfaces 28983 1726882975.25464: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882975.25468: getting variables 28983 1726882975.25470: in VariableManager get_vars() 28983 1726882975.25503: Calling all_inventory to load vars for managed_node2 28983 1726882975.25507: Calling groups_inventory to load vars for managed_node2 28983 1726882975.25511: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.25524: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.25528: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.25531: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.26181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.26522: done with get_vars() 28983 1726882975.26538: done getting variables 28983 1726882975.26604: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:42:55 -0400 (0:00:00.606) 0:00:05.264 ****** 28983 1726882975.26647: entering _queue_task() for managed_node2/set_fact 28983 1726882975.26912: worker is 1 (out of 1 available) 28983 1726882975.26924: exiting _queue_task() for managed_node2/set_fact 28983 1726882975.27052: done queuing things up, now waiting for results queue to drain 28983 1726882975.27054: waiting for pending results... 28983 1726882975.27291: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28983 1726882975.27378: in run() - task 0affe814-3a2d-b16d-c0a7-0000000000f6 28983 1726882975.27385: variable 'ansible_search_path' from source: unknown 28983 1726882975.27488: variable 'ansible_search_path' from source: unknown 28983 1726882975.27491: calling self._execute() 28983 1726882975.27519: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.27531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.27549: variable 'omit' from source: magic vars 28983 1726882975.28059: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.28076: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.28088: variable 'omit' from source: magic vars 28983 1726882975.28167: variable 'omit' from source: magic vars 28983 1726882975.28322: variable '_current_interfaces' from source: set_fact 28983 1726882975.28401: variable 'omit' from source: magic vars 28983 1726882975.28451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 
1726882975.28506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882975.28535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882975.28563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.28589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.28628: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882975.28687: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.28692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.28777: Set connection var ansible_connection to ssh 28983 1726882975.28802: Set connection var ansible_shell_executable to /bin/sh 28983 1726882975.28905: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882975.28908: Set connection var ansible_timeout to 10 28983 1726882975.28911: Set connection var ansible_pipelining to False 28983 1726882975.28913: Set connection var ansible_shell_type to sh 28983 1726882975.28915: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.28918: variable 'ansible_connection' from source: unknown 28983 1726882975.28920: variable 'ansible_module_compression' from source: unknown 28983 1726882975.28923: variable 'ansible_shell_type' from source: unknown 28983 1726882975.28925: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.28927: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.28929: variable 'ansible_pipelining' from source: unknown 28983 1726882975.28931: variable 'ansible_timeout' from source: unknown 28983 1726882975.28933: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 28983 1726882975.29103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882975.29125: variable 'omit' from source: magic vars 28983 1726882975.29138: starting attempt loop 28983 1726882975.29147: running the handler 28983 1726882975.29167: handler run complete 28983 1726882975.29181: attempt loop complete, returning result 28983 1726882975.29231: _execute() done 28983 1726882975.29234: dumping result to json 28983 1726882975.29238: done dumping result, returning 28983 1726882975.29242: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affe814-3a2d-b16d-c0a7-0000000000f6] 28983 1726882975.29244: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000f6 28983 1726882975.29441: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000f6 28983 1726882975.29451: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28983 1726882975.29513: no more pending results, returning what we have 28983 1726882975.29516: results queue empty 28983 1726882975.29517: checking for any_errors_fatal 28983 1726882975.29524: done checking for any_errors_fatal 28983 1726882975.29525: checking for max_fail_percentage 28983 1726882975.29527: done checking for max_fail_percentage 28983 1726882975.29528: checking to see if all hosts have failed and the running result is not ok 28983 1726882975.29529: done checking to see if all hosts have failed 28983 1726882975.29530: getting the remaining hosts for this loop 28983 1726882975.29532: done getting the remaining hosts for this loop 28983 1726882975.29539: getting the next task for host managed_node2 
28983 1726882975.29548: done getting next task for host managed_node2 28983 1726882975.29550: ^ task is: TASK: Show current_interfaces 28983 1726882975.29637: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882975.29643: getting variables 28983 1726882975.29644: in VariableManager get_vars() 28983 1726882975.29675: Calling all_inventory to load vars for managed_node2 28983 1726882975.29678: Calling groups_inventory to load vars for managed_node2 28983 1726882975.29681: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.29691: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.29695: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.29699: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.29974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.30353: done with get_vars() 28983 1726882975.30364: done getting variables 28983 1726882975.30429: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:42:55 -0400 (0:00:00.038) 0:00:05.302 ****** 28983 1726882975.30469: entering _queue_task() for managed_node2/debug 28983 1726882975.30786: worker is 1 (out of 1 available) 28983 1726882975.30814: exiting _queue_task() for managed_node2/debug 28983 1726882975.30830: done queuing things up, now waiting for results queue to drain 28983 1726882975.30832: waiting for pending results... 
28983 1726882975.31023: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28983 1726882975.31199: in run() - task 0affe814-3a2d-b16d-c0a7-0000000000bb 28983 1726882975.31203: variable 'ansible_search_path' from source: unknown 28983 1726882975.31206: variable 'ansible_search_path' from source: unknown 28983 1726882975.31229: calling self._execute() 28983 1726882975.31323: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.31338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.31356: variable 'omit' from source: magic vars 28983 1726882975.31776: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.31852: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.31855: variable 'omit' from source: magic vars 28983 1726882975.31870: variable 'omit' from source: magic vars 28983 1726882975.31993: variable 'current_interfaces' from source: set_fact 28983 1726882975.32027: variable 'omit' from source: magic vars 28983 1726882975.32082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882975.32130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882975.32160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882975.32193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.32210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.32288: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882975.32292: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.32295: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.32396: Set connection var ansible_connection to ssh 28983 1726882975.32416: Set connection var ansible_shell_executable to /bin/sh 28983 1726882975.32435: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882975.32454: Set connection var ansible_timeout to 10 28983 1726882975.32506: Set connection var ansible_pipelining to False 28983 1726882975.32510: Set connection var ansible_shell_type to sh 28983 1726882975.32512: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.32516: variable 'ansible_connection' from source: unknown 28983 1726882975.32525: variable 'ansible_module_compression' from source: unknown 28983 1726882975.32537: variable 'ansible_shell_type' from source: unknown 28983 1726882975.32546: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.32554: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.32615: variable 'ansible_pipelining' from source: unknown 28983 1726882975.32619: variable 'ansible_timeout' from source: unknown 28983 1726882975.32622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.32757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882975.32775: variable 'omit' from source: magic vars 28983 1726882975.32789: starting attempt loop 28983 1726882975.32797: running the handler 28983 1726882975.32891: handler run complete 28983 1726882975.32895: attempt loop complete, returning result 28983 1726882975.32897: _execute() done 28983 1726882975.32900: dumping result to json 28983 1726882975.32905: done dumping result, returning 28983 1726882975.32918: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affe814-3a2d-b16d-c0a7-0000000000bb] 28983 1726882975.32928: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000bb 28983 1726882975.33081: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000000bb 28983 1726882975.33085: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28983 1726882975.33298: no more pending results, returning what we have 28983 1726882975.33301: results queue empty 28983 1726882975.33302: checking for any_errors_fatal 28983 1726882975.33308: done checking for any_errors_fatal 28983 1726882975.33309: checking for max_fail_percentage 28983 1726882975.33310: done checking for max_fail_percentage 28983 1726882975.33312: checking to see if all hosts have failed and the running result is not ok 28983 1726882975.33313: done checking to see if all hosts have failed 28983 1726882975.33314: getting the remaining hosts for this loop 28983 1726882975.33315: done getting the remaining hosts for this loop 28983 1726882975.33319: getting the next task for host managed_node2 28983 1726882975.33328: done getting next task for host managed_node2 28983 1726882975.33331: ^ task is: TASK: Setup 28983 1726882975.33337: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882975.33342: getting variables 28983 1726882975.33344: in VariableManager get_vars() 28983 1726882975.33371: Calling all_inventory to load vars for managed_node2 28983 1726882975.33375: Calling groups_inventory to load vars for managed_node2 28983 1726882975.33378: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.33387: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.33391: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.33395: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.33656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.33973: done with get_vars() 28983 1726882975.33985: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:42:55 -0400 (0:00:00.036) 0:00:05.338 ****** 28983 1726882975.34093: entering _queue_task() for managed_node2/include_tasks 28983 1726882975.34398: worker is 1 (out of 1 available) 28983 1726882975.34412: exiting _queue_task() for managed_node2/include_tasks 28983 1726882975.34425: done queuing things up, now waiting for results queue to drain 28983 1726882975.34427: waiting for pending results... 
28983 1726882975.34602: running TaskExecutor() for managed_node2/TASK: Setup 28983 1726882975.34669: in run() - task 0affe814-3a2d-b16d-c0a7-000000000094 28983 1726882975.34721: variable 'ansible_search_path' from source: unknown 28983 1726882975.34725: variable 'ansible_search_path' from source: unknown 28983 1726882975.34729: variable 'lsr_setup' from source: include params 28983 1726882975.34895: variable 'lsr_setup' from source: include params 28983 1726882975.34952: variable 'omit' from source: magic vars 28983 1726882975.35101: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.35111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.35121: variable 'omit' from source: magic vars 28983 1726882975.35307: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.35316: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.35323: variable 'item' from source: unknown 28983 1726882975.35381: variable 'item' from source: unknown 28983 1726882975.35410: variable 'item' from source: unknown 28983 1726882975.35460: variable 'item' from source: unknown 28983 1726882975.35592: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.35595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.35598: variable 'omit' from source: magic vars 28983 1726882975.35713: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.35720: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.35723: variable 'item' from source: unknown 28983 1726882975.35771: variable 'item' from source: unknown 28983 1726882975.35797: variable 'item' from source: unknown 28983 1726882975.35853: variable 'item' from source: unknown 28983 1726882975.35920: dumping result to json 28983 1726882975.35923: done dumping result, returning 28983 
1726882975.35933: done running TaskExecutor() for managed_node2/TASK: Setup [0affe814-3a2d-b16d-c0a7-000000000094] 28983 1726882975.35938: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000094 28983 1726882975.35981: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000094 28983 1726882975.35984: WORKER PROCESS EXITING 28983 1726882975.36017: no more pending results, returning what we have 28983 1726882975.36021: in VariableManager get_vars() 28983 1726882975.36057: Calling all_inventory to load vars for managed_node2 28983 1726882975.36060: Calling groups_inventory to load vars for managed_node2 28983 1726882975.36064: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.36073: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.36076: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.36083: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.36268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.36469: done with get_vars() 28983 1726882975.36481: variable 'ansible_search_path' from source: unknown 28983 1726882975.36486: variable 'ansible_search_path' from source: unknown 28983 1726882975.36531: variable 'ansible_search_path' from source: unknown 28983 1726882975.36532: variable 'ansible_search_path' from source: unknown 28983 1726882975.36575: we have included files to process 28983 1726882975.36577: generating all_blocks data 28983 1726882975.36579: done generating all_blocks data 28983 1726882975.36582: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28983 1726882975.36584: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28983 1726882975.36587: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28983 1726882975.36847: done processing included file 28983 1726882975.36850: iterating over new_blocks loaded from include file 28983 1726882975.36851: in VariableManager get_vars() 28983 1726882975.36867: done with get_vars() 28983 1726882975.36869: filtering new block on tags 28983 1726882975.36907: done filtering new block on tags 28983 1726882975.36910: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 => (item=tasks/delete_interface.yml) 28983 1726882975.36915: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726882975.36916: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726882975.36919: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726882975.37066: in VariableManager get_vars() 28983 1726882975.37087: done with get_vars() 28983 1726882975.37230: done processing included file 28983 1726882975.37233: iterating over new_blocks loaded from include file 28983 1726882975.37238: in VariableManager get_vars() 28983 1726882975.37254: done with get_vars() 28983 1726882975.37256: filtering new block on tags 28983 1726882975.37299: done filtering new block on tags 28983 1726882975.37303: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 28983 1726882975.37307: extending task lists for all hosts with 
included blocks 28983 1726882975.37962: done extending task lists 28983 1726882975.37963: done processing included files 28983 1726882975.37964: results queue empty 28983 1726882975.37964: checking for any_errors_fatal 28983 1726882975.37966: done checking for any_errors_fatal 28983 1726882975.37967: checking for max_fail_percentage 28983 1726882975.37968: done checking for max_fail_percentage 28983 1726882975.37968: checking to see if all hosts have failed and the running result is not ok 28983 1726882975.37969: done checking to see if all hosts have failed 28983 1726882975.37970: getting the remaining hosts for this loop 28983 1726882975.37971: done getting the remaining hosts for this loop 28983 1726882975.37973: getting the next task for host managed_node2 28983 1726882975.37976: done getting next task for host managed_node2 28983 1726882975.37977: ^ task is: TASK: Remove test interface if necessary 28983 1726882975.37980: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882975.37982: getting variables 28983 1726882975.37983: in VariableManager get_vars() 28983 1726882975.37993: Calling all_inventory to load vars for managed_node2 28983 1726882975.37995: Calling groups_inventory to load vars for managed_node2 28983 1726882975.37997: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.38001: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.38003: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.38005: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.38127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.38300: done with get_vars() 28983 1726882975.38307: done getting variables 28983 1726882975.38341: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:42:55 -0400 (0:00:00.042) 0:00:05.381 ****** 28983 1726882975.38362: entering _queue_task() for managed_node2/command 28983 1726882975.38539: worker is 1 (out of 1 available) 28983 1726882975.38554: exiting _queue_task() for managed_node2/command 28983 1726882975.38567: done queuing things up, now waiting for results queue to drain 28983 1726882975.38569: waiting for pending results... 
28983 1726882975.38732: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 28983 1726882975.38808: in run() - task 0affe814-3a2d-b16d-c0a7-00000000011b 28983 1726882975.38819: variable 'ansible_search_path' from source: unknown 28983 1726882975.38823: variable 'ansible_search_path' from source: unknown 28983 1726882975.38859: calling self._execute() 28983 1726882975.38925: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.38932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.38944: variable 'omit' from source: magic vars 28983 1726882975.39248: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.39259: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.39265: variable 'omit' from source: magic vars 28983 1726882975.39307: variable 'omit' from source: magic vars 28983 1726882975.39389: variable 'interface' from source: play vars 28983 1726882975.39404: variable 'omit' from source: magic vars 28983 1726882975.39439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882975.39473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882975.39493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882975.39509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.39518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.39546: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882975.39550: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.39561: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.39638: Set connection var ansible_connection to ssh 28983 1726882975.39648: Set connection var ansible_shell_executable to /bin/sh 28983 1726882975.39657: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882975.39669: Set connection var ansible_timeout to 10 28983 1726882975.39675: Set connection var ansible_pipelining to False 28983 1726882975.39677: Set connection var ansible_shell_type to sh 28983 1726882975.39700: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.39704: variable 'ansible_connection' from source: unknown 28983 1726882975.39706: variable 'ansible_module_compression' from source: unknown 28983 1726882975.39709: variable 'ansible_shell_type' from source: unknown 28983 1726882975.39713: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.39716: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.39722: variable 'ansible_pipelining' from source: unknown 28983 1726882975.39729: variable 'ansible_timeout' from source: unknown 28983 1726882975.39743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.39891: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882975.40074: variable 'omit' from source: magic vars 28983 1726882975.40077: starting attempt loop 28983 1726882975.40082: running the handler 28983 1726882975.40084: _low_level_execute_command(): starting 28983 1726882975.40086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882975.40696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.40700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.40705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.40773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.40794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.40870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.42655: stdout chunk (state=3): >>>/root <<< 28983 1726882975.42806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.42811: stdout chunk (state=3): >>><<< 28983 1726882975.42819: stderr chunk (state=3): >>><<< 28983 1726882975.42839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.42853: _low_level_execute_command(): starting 28983 1726882975.42859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735 `" && echo ansible-tmp-1726882975.42839-29210-202123355305735="` echo /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735 `" ) && sleep 0' 28983 1726882975.43351: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.43354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.43357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.43366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.43437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.43440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.43514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.45553: stdout chunk (state=3): >>>ansible-tmp-1726882975.42839-29210-202123355305735=/root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735 <<< 28983 1726882975.45671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.45714: stderr chunk (state=3): >>><<< 28983 1726882975.45718: stdout chunk (state=3): >>><<< 28983 1726882975.45733: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882975.42839-29210-202123355305735=/root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.45761: variable 'ansible_module_compression' from source: unknown 28983 1726882975.45802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726882975.45829: variable 'ansible_facts' from source: unknown 28983 1726882975.45898: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py 28983 1726882975.46023: Sending initial data 28983 1726882975.46026: Sent initial data (154 bytes) 28983 1726882975.46458: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882975.46462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882975.46466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882975.46469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882975.46471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.46530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.46532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.46600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.48259: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882975.48324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882975.48393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqo4t5ymn /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py <<< 28983 1726882975.48396: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py" <<< 28983 1726882975.48456: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqo4t5ymn" to remote "/root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py" <<< 28983 1726882975.49347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.49404: stderr chunk (state=3): >>><<< 28983 1726882975.49407: stdout chunk (state=3): >>><<< 28983 1726882975.49426: done transferring module to remote 28983 1726882975.49436: _low_level_execute_command(): starting 28983 1726882975.49441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/ /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py && sleep 0' 28983 1726882975.49857: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.49860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.49862: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882975.49865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.49922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.49927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.49991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.51966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.51969: stdout chunk (state=3): >>><<< 28983 1726882975.51972: stderr chunk (state=3): >>><<< 28983 1726882975.51990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.52000: _low_level_execute_command(): starting 28983 1726882975.52083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/AnsiballZ_command.py && sleep 0' 28983 1726882975.52779: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.52821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.52825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.52915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.71105: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot 
find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:42:55.702167", "end": "2024-09-20 21:42:55.709878", "delta": "0:00:00.007711", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726882975.72775: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. <<< 28983 1726882975.72803: stdout chunk (state=3): >>><<< 28983 1726882975.72806: stderr chunk (state=3): >>><<< 28983 1726882975.72823: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:42:55.702167", "end": "2024-09-20 21:42:55.709878", "delta": "0:00:00.007711", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 28983 1726882975.72958: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882975.72963: _low_level_execute_command(): starting 28983 1726882975.72965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882975.42839-29210-202123355305735/ > /dev/null 2>&1 && sleep 0' 28983 1726882975.73646: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726882975.73668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.73685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.73797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.75768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.75826: stderr chunk (state=3): >>><<< 28983 1726882975.75838: stdout chunk (state=3): >>><<< 28983 1726882975.75866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.75882: handler run complete 28983 1726882975.75916: Evaluated conditional (False): False 28983 1726882975.75936: attempt loop complete, returning result 28983 1726882975.75945: _execute() done 28983 1726882975.75952: dumping result to json 28983 1726882975.75963: done dumping result, returning 28983 1726882975.76039: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affe814-3a2d-b16d-c0a7-00000000011b] 28983 1726882975.76042: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000011b fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007711", "end": "2024-09-20 21:42:55.709878", "rc": 1, "start": "2024-09-20 21:42:55.702167" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 28983 1726882975.76287: no more pending results, returning what we have 28983 1726882975.76293: results queue empty 28983 1726882975.76294: checking for any_errors_fatal 28983 1726882975.76409: done checking for any_errors_fatal 28983 1726882975.76411: checking for max_fail_percentage 28983 1726882975.76413: done checking for max_fail_percentage 28983 1726882975.76414: checking to see if all hosts have failed and the running result is not ok 28983 1726882975.76415: done checking to see if all hosts have failed 28983 1726882975.76416: getting the remaining hosts for this loop 28983 1726882975.76419: done getting the remaining hosts for this loop 28983 1726882975.76424: getting the next task for host managed_node2 28983 1726882975.76435: done getting next task for host managed_node2 
28983 1726882975.76439: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726882975.76445: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882975.76449: getting variables 28983 1726882975.76451: in VariableManager get_vars() 28983 1726882975.76488: Calling all_inventory to load vars for managed_node2 28983 1726882975.76491: Calling groups_inventory to load vars for managed_node2 28983 1726882975.76496: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.76544: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000011b 28983 1726882975.76547: WORKER PROCESS EXITING 28983 1726882975.76558: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.76562: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.76566: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.76975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.77323: done with get_vars() 28983 1726882975.77339: done getting variables TASK [Include the task 'get_interface_stat.yml'] 
******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:42:55 -0400 (0:00:00.390) 0:00:05.772 ****** 28983 1726882975.77452: entering _queue_task() for managed_node2/include_tasks 28983 1726882975.77763: worker is 1 (out of 1 available) 28983 1726882975.77774: exiting _queue_task() for managed_node2/include_tasks 28983 1726882975.77789: done queuing things up, now waiting for results queue to drain 28983 1726882975.77791: waiting for pending results... 28983 1726882975.78038: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726882975.78189: in run() - task 0affe814-3a2d-b16d-c0a7-00000000011f 28983 1726882975.78241: variable 'ansible_search_path' from source: unknown 28983 1726882975.78244: variable 'ansible_search_path' from source: unknown 28983 1726882975.78266: calling self._execute() 28983 1726882975.78357: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.78375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.78440: variable 'omit' from source: magic vars 28983 1726882975.78884: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.78888: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.78890: _execute() done 28983 1726882975.78894: dumping result to json 28983 1726882975.78896: done dumping result, returning 28983 1726882975.78901: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-00000000011f] 28983 1726882975.78912: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000011f 28983 1726882975.79166: no more pending results, returning what we have 28983 1726882975.79171: in VariableManager get_vars() 28983 1726882975.79208: Calling all_inventory to load 
vars for managed_node2 28983 1726882975.79211: Calling groups_inventory to load vars for managed_node2 28983 1726882975.79215: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.79227: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.79231: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.79237: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.79573: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000011f 28983 1726882975.79577: WORKER PROCESS EXITING 28983 1726882975.79605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.79974: done with get_vars() 28983 1726882975.79985: variable 'ansible_search_path' from source: unknown 28983 1726882975.79987: variable 'ansible_search_path' from source: unknown 28983 1726882975.79996: variable 'item' from source: include params 28983 1726882975.80126: variable 'item' from source: include params 28983 1726882975.80174: we have included files to process 28983 1726882975.80175: generating all_blocks data 28983 1726882975.80178: done generating all_blocks data 28983 1726882975.80186: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882975.80187: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882975.80195: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882975.80469: done processing included file 28983 1726882975.80471: iterating over new_blocks loaded from include file 28983 1726882975.80473: in VariableManager get_vars() 28983 1726882975.80493: done with get_vars() 28983 1726882975.80495: filtering new block on tags 28983 
1726882975.80533: done filtering new block on tags 28983 1726882975.80539: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726882975.80545: extending task lists for all hosts with included blocks 28983 1726882975.80769: done extending task lists 28983 1726882975.80771: done processing included files 28983 1726882975.80772: results queue empty 28983 1726882975.80773: checking for any_errors_fatal 28983 1726882975.80777: done checking for any_errors_fatal 28983 1726882975.80780: checking for max_fail_percentage 28983 1726882975.80782: done checking for max_fail_percentage 28983 1726882975.80783: checking to see if all hosts have failed and the running result is not ok 28983 1726882975.80784: done checking to see if all hosts have failed 28983 1726882975.80785: getting the remaining hosts for this loop 28983 1726882975.80786: done getting the remaining hosts for this loop 28983 1726882975.80789: getting the next task for host managed_node2 28983 1726882975.80795: done getting next task for host managed_node2 28983 1726882975.80797: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726882975.80802: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882975.80804: getting variables 28983 1726882975.80805: in VariableManager get_vars() 28983 1726882975.80815: Calling all_inventory to load vars for managed_node2 28983 1726882975.80818: Calling groups_inventory to load vars for managed_node2 28983 1726882975.80821: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882975.80826: Calling all_plugins_play to load vars for managed_node2 28983 1726882975.80830: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882975.80836: Calling groups_plugins_play to load vars for managed_node2 28983 1726882975.81057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882975.81363: done with get_vars() 28983 1726882975.81373: done getting variables 28983 1726882975.81531: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:42:55 -0400 (0:00:00.041) 0:00:05.813 ****** 28983 1726882975.81566: entering _queue_task() for managed_node2/stat 28983 1726882975.81902: worker is 1 (out of 1 available) 28983 1726882975.81915: exiting _queue_task() for managed_node2/stat 28983 1726882975.81927: done queuing 
things up, now waiting for results queue to drain 28983 1726882975.81929: waiting for pending results... 28983 1726882975.82165: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726882975.82321: in run() - task 0affe814-3a2d-b16d-c0a7-00000000016e 28983 1726882975.82346: variable 'ansible_search_path' from source: unknown 28983 1726882975.82385: variable 'ansible_search_path' from source: unknown 28983 1726882975.82412: calling self._execute() 28983 1726882975.82509: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.82604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.82609: variable 'omit' from source: magic vars 28983 1726882975.83347: variable 'ansible_distribution_major_version' from source: facts 28983 1726882975.83371: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882975.83389: variable 'omit' from source: magic vars 28983 1726882975.83458: variable 'omit' from source: magic vars 28983 1726882975.83600: variable 'interface' from source: play vars 28983 1726882975.83625: variable 'omit' from source: magic vars 28983 1726882975.83699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882975.83738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882975.83764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882975.83790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.83916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882975.83921: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882975.83924: variable 
'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.83927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.83997: Set connection var ansible_connection to ssh 28983 1726882975.84016: Set connection var ansible_shell_executable to /bin/sh 28983 1726882975.84040: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882975.84056: Set connection var ansible_timeout to 10 28983 1726882975.84134: Set connection var ansible_pipelining to False 28983 1726882975.84138: Set connection var ansible_shell_type to sh 28983 1726882975.84141: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.84144: variable 'ansible_connection' from source: unknown 28983 1726882975.84147: variable 'ansible_module_compression' from source: unknown 28983 1726882975.84149: variable 'ansible_shell_type' from source: unknown 28983 1726882975.84152: variable 'ansible_shell_executable' from source: unknown 28983 1726882975.84154: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882975.84157: variable 'ansible_pipelining' from source: unknown 28983 1726882975.84159: variable 'ansible_timeout' from source: unknown 28983 1726882975.84161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882975.84461: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882975.84466: variable 'omit' from source: magic vars 28983 1726882975.84468: starting attempt loop 28983 1726882975.84471: running the handler 28983 1726882975.84473: _low_level_execute_command(): starting 28983 1726882975.84475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882975.85276: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.85349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.85425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.85444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.85476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.85588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.87380: stdout chunk (state=3): >>>/root <<< 28983 1726882975.87569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.87572: stdout chunk (state=3): >>><<< 28983 1726882975.87575: stderr chunk (state=3): >>><<< 28983 1726882975.87703: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.87707: _low_level_execute_command(): starting 28983 1726882975.87710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995 `" && echo ansible-tmp-1726882975.8760004-29233-59865804228995="` echo /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995 `" ) && sleep 0' 28983 1726882975.88300: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.88323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.88429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.90482: stdout chunk (state=3): >>>ansible-tmp-1726882975.8760004-29233-59865804228995=/root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995 <<< 28983 1726882975.90666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.90676: stdout chunk (state=3): >>><<< 28983 1726882975.90688: stderr chunk (state=3): >>><<< 28983 1726882975.90713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882975.8760004-29233-59865804228995=/root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.90765: variable 'ansible_module_compression' from source: unknown 28983 1726882975.90836: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726882975.90875: variable 'ansible_facts' from source: unknown 28983 1726882975.91044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py 28983 1726882975.91163: Sending initial data 28983 1726882975.91173: Sent initial data (152 bytes) 28983 1726882975.91773: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882975.91788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882975.91843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.91857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726882975.91942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.91967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.91981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.92076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.93785: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882975.93876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882975.93943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpsh85eeyp /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py <<< 28983 1726882975.93970: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py" <<< 28983 1726882975.94040: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpsh85eeyp" to remote "/root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py" <<< 28983 1726882975.95371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.95456: stderr chunk (state=3): >>><<< 28983 1726882975.95466: stdout chunk (state=3): >>><<< 28983 1726882975.95490: done transferring module to remote 28983 1726882975.95500: _low_level_execute_command(): starting 28983 1726882975.95505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/ /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py && sleep 0' 28983 1726882975.95955: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882975.95958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882975.95961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882975.95964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882975.95967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.96020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882975.96025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882975.96101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882975.98062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882975.98124: stderr chunk (state=3): >>><<< 28983 1726882975.98127: stdout chunk (state=3): >>><<< 28983 1726882975.98136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882975.98139: _low_level_execute_command(): starting 28983 1726882975.98144: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/AnsiballZ_stat.py && sleep 0' 28983 1726882975.98599: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.98602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.98605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882975.98607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882975.98609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882975.98667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882975.98670: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726882975.98750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882976.16398: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726882976.17968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882976.17972: stdout chunk (state=3): >>><<< 28983 1726882976.17974: stderr chunk (state=3): >>><<< 28983 1726882976.17996: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882976.18145: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882976.18149: _low_level_execute_command(): starting 28983 1726882976.18151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882975.8760004-29233-59865804228995/ > /dev/null 2>&1 && sleep 0' 28983 1726882976.18769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882976.18795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882976.18928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882976.18964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882976.19077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882976.21242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882976.21246: stdout chunk (state=3): >>><<< 28983 1726882976.21248: stderr chunk (state=3): >>><<< 28983 1726882976.21251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882976.21258: handler run complete 28983 1726882976.21260: attempt loop complete, returning result 28983 1726882976.21262: _execute() done 28983 1726882976.21266: dumping result to json 28983 1726882976.21268: done dumping result, returning 28983 1726882976.21270: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-00000000016e] 28983 1726882976.21289: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000016e 28983 1726882976.21476: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000016e 28983 1726882976.21482: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726882976.21562: no more pending results, returning what we have 28983 1726882976.21566: results queue empty 28983 1726882976.21567: checking for any_errors_fatal 28983 1726882976.21569: done checking for any_errors_fatal 28983 1726882976.21570: checking for max_fail_percentage 28983 1726882976.21572: done checking for max_fail_percentage 28983 1726882976.21574: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.21575: done checking to see if all hosts have failed 28983 1726882976.21576: getting the remaining hosts for this loop 28983 1726882976.21580: done getting the remaining hosts for this loop 28983 1726882976.21585: getting the next task for host managed_node2 28983 1726882976.21596: done getting next task for host managed_node2 28983 1726882976.21601: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28983 1726882976.21606: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882976.21611: getting variables 28983 1726882976.21612: in VariableManager get_vars() 28983 1726882976.21651: Calling all_inventory to load vars for managed_node2 28983 1726882976.21655: Calling groups_inventory to load vars for managed_node2 28983 1726882976.21659: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.21672: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.21675: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.21682: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.22507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.22850: done with get_vars() 28983 1726882976.22863: done getting variables 28983 1726882976.22982: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 28983 1726882976.23124: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:42:56 -0400 (0:00:00.415) 0:00:06.229 ****** 28983 1726882976.23166: entering _queue_task() for managed_node2/assert 28983 1726882976.23168: Creating lock for assert 28983 1726882976.23601: worker is 1 (out of 1 available) 28983 1726882976.23615: exiting _queue_task() for managed_node2/assert 28983 1726882976.23627: done queuing things up, now waiting for results queue to drain 28983 1726882976.23629: waiting for pending results... 28983 1726882976.23910: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 28983 1726882976.24007: in run() - task 0affe814-3a2d-b16d-c0a7-000000000120 28983 1726882976.24011: variable 'ansible_search_path' from source: unknown 28983 1726882976.24014: variable 'ansible_search_path' from source: unknown 28983 1726882976.24057: calling self._execute() 28983 1726882976.24224: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.24228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.24231: variable 'omit' from source: magic vars 28983 1726882976.24632: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.24659: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.24674: variable 'omit' from source: magic vars 28983 1726882976.24741: variable 'omit' from source: magic vars 28983 1726882976.24883: variable 'interface' from source: play vars 28983 1726882976.24909: variable 'omit' from source: magic vars 28983 1726882976.24963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882976.25020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882976.25050: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882976.25098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882976.25102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882976.25143: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882976.25208: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.25211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.25297: Set connection var ansible_connection to ssh 28983 1726882976.25324: Set connection var ansible_shell_executable to /bin/sh 28983 1726882976.25344: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882976.25359: Set connection var ansible_timeout to 10 28983 1726882976.25370: Set connection var ansible_pipelining to False 28983 1726882976.25381: Set connection var ansible_shell_type to sh 28983 1726882976.25412: variable 'ansible_shell_executable' from source: unknown 28983 1726882976.25429: variable 'ansible_connection' from source: unknown 28983 1726882976.25535: variable 'ansible_module_compression' from source: unknown 28983 1726882976.25541: variable 'ansible_shell_type' from source: unknown 28983 1726882976.25543: variable 'ansible_shell_executable' from source: unknown 28983 1726882976.25546: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.25548: variable 'ansible_pipelining' from source: unknown 28983 1726882976.25550: variable 'ansible_timeout' from source: unknown 28983 1726882976.25552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.25649: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882976.25666: variable 'omit' from source: magic vars 28983 1726882976.25675: starting attempt loop 28983 1726882976.25684: running the handler 28983 1726882976.25873: variable 'interface_stat' from source: set_fact 28983 1726882976.25894: Evaluated conditional (not interface_stat.stat.exists): True 28983 1726882976.25908: handler run complete 28983 1726882976.25931: attempt loop complete, returning result 28983 1726882976.25942: _execute() done 28983 1726882976.25951: dumping result to json 28983 1726882976.25960: done dumping result, returning 28983 1726882976.25984: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-000000000120] 28983 1726882976.25996: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000120 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726882976.26290: no more pending results, returning what we have 28983 1726882976.26296: results queue empty 28983 1726882976.26297: checking for any_errors_fatal 28983 1726882976.26307: done checking for any_errors_fatal 28983 1726882976.26309: checking for max_fail_percentage 28983 1726882976.26311: done checking for max_fail_percentage 28983 1726882976.26312: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.26313: done checking to see if all hosts have failed 28983 1726882976.26314: getting the remaining hosts for this loop 28983 1726882976.26316: done getting the remaining hosts for this loop 28983 1726882976.26322: getting the next task for host managed_node2 28983 1726882976.26331: done getting next task for host managed_node2 28983 1726882976.26337: ^ task is: TASK: Test 28983 1726882976.26341: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882976.26347: getting variables 28983 1726882976.26349: in VariableManager get_vars() 28983 1726882976.26388: Calling all_inventory to load vars for managed_node2 28983 1726882976.26392: Calling groups_inventory to load vars for managed_node2 28983 1726882976.26397: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.26410: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.26414: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.26419: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.26901: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000120 28983 1726882976.26906: WORKER PROCESS EXITING 28983 1726882976.26937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.27313: done with get_vars() 28983 1726882976.27332: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:42:56 -0400 (0:00:00.042) 0:00:06.272 ****** 28983 1726882976.27453: entering _queue_task() for managed_node2/include_tasks 28983 1726882976.27751: worker is 1 (out of 1 available) 28983 1726882976.27881: exiting _queue_task() for managed_node2/include_tasks 28983 1726882976.27894: done queuing things up, 
now waiting for results queue to drain 28983 1726882976.27896: waiting for pending results... 28983 1726882976.28095: running TaskExecutor() for managed_node2/TASK: Test 28983 1726882976.28295: in run() - task 0affe814-3a2d-b16d-c0a7-000000000095 28983 1726882976.28299: variable 'ansible_search_path' from source: unknown 28983 1726882976.28302: variable 'ansible_search_path' from source: unknown 28983 1726882976.28325: variable 'lsr_test' from source: include params 28983 1726882976.28652: variable 'lsr_test' from source: include params 28983 1726882976.28708: variable 'omit' from source: magic vars 28983 1726882976.28895: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.28975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.28979: variable 'omit' from source: magic vars 28983 1726882976.29270: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.29282: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.29300: variable 'item' from source: unknown 28983 1726882976.29384: variable 'item' from source: unknown 28983 1726882976.29439: variable 'item' from source: unknown 28983 1726882976.29641: variable 'item' from source: unknown 28983 1726882976.29758: dumping result to json 28983 1726882976.30048: done dumping result, returning 28983 1726882976.30051: done running TaskExecutor() for managed_node2/TASK: Test [0affe814-3a2d-b16d-c0a7-000000000095] 28983 1726882976.30054: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000095 28983 1726882976.30249: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000095 28983 1726882976.30253: WORKER PROCESS EXITING 28983 1726882976.30278: no more pending results, returning what we have 28983 1726882976.30282: in VariableManager get_vars() 28983 1726882976.30310: Calling all_inventory to load vars for managed_node2 28983 1726882976.30314: Calling groups_inventory 
to load vars for managed_node2 28983 1726882976.30316: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.30324: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.30326: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.30332: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.30517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.30702: done with get_vars() 28983 1726882976.30711: variable 'ansible_search_path' from source: unknown 28983 1726882976.30712: variable 'ansible_search_path' from source: unknown 28983 1726882976.30745: we have included files to process 28983 1726882976.30746: generating all_blocks data 28983 1726882976.30747: done generating all_blocks data 28983 1726882976.30751: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726882976.30752: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726882976.30754: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726882976.31014: done processing included file 28983 1726882976.31016: iterating over new_blocks loaded from include file 28983 1726882976.31017: in VariableManager get_vars() 28983 1726882976.31028: done with get_vars() 28983 1726882976.31029: filtering new block on tags 28983 1726882976.31059: done filtering new block on tags 28983 1726882976.31061: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 28983 
1726882976.31065: extending task lists for all hosts with included blocks 28983 1726882976.31786: done extending task lists 28983 1726882976.31787: done processing included files 28983 1726882976.31788: results queue empty 28983 1726882976.31788: checking for any_errors_fatal 28983 1726882976.31790: done checking for any_errors_fatal 28983 1726882976.31791: checking for max_fail_percentage 28983 1726882976.31792: done checking for max_fail_percentage 28983 1726882976.31793: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.31794: done checking to see if all hosts have failed 28983 1726882976.31794: getting the remaining hosts for this loop 28983 1726882976.31796: done getting the remaining hosts for this loop 28983 1726882976.31798: getting the next task for host managed_node2 28983 1726882976.31802: done getting next task for host managed_node2 28983 1726882976.31804: ^ task is: TASK: Include network role 28983 1726882976.31806: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882976.31808: getting variables 28983 1726882976.31808: in VariableManager get_vars() 28983 1726882976.31815: Calling all_inventory to load vars for managed_node2 28983 1726882976.31817: Calling groups_inventory to load vars for managed_node2 28983 1726882976.31818: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.31822: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.31824: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.31827: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.32013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.32322: done with get_vars() 28983 1726882976.32333: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:42:56 -0400 (0:00:00.049) 0:00:06.322 ****** 28983 1726882976.32561: entering _queue_task() for managed_node2/include_role 28983 1726882976.32564: Creating lock for include_role 28983 1726882976.33162: worker is 1 (out of 1 available) 28983 1726882976.33174: exiting _queue_task() for managed_node2/include_role 28983 1726882976.33189: done queuing things up, now waiting for results queue to drain 28983 1726882976.33190: waiting for pending results... 
28983 1726882976.33781: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726882976.33876: in run() - task 0affe814-3a2d-b16d-c0a7-00000000018e 28983 1726882976.33902: variable 'ansible_search_path' from source: unknown 28983 1726882976.33910: variable 'ansible_search_path' from source: unknown 28983 1726882976.33954: calling self._execute() 28983 1726882976.34046: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.34060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.34096: variable 'omit' from source: magic vars 28983 1726882976.34538: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.34642: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.34646: _execute() done 28983 1726882976.34649: dumping result to json 28983 1726882976.34651: done dumping result, returning 28983 1726882976.34658: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-00000000018e] 28983 1726882976.34661: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000018e 28983 1726882976.34745: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000018e 28983 1726882976.34748: WORKER PROCESS EXITING 28983 1726882976.34790: no more pending results, returning what we have 28983 1726882976.34795: in VariableManager get_vars() 28983 1726882976.34824: Calling all_inventory to load vars for managed_node2 28983 1726882976.34827: Calling groups_inventory to load vars for managed_node2 28983 1726882976.34830: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.34841: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.34845: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.34848: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.35125: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.35452: done with get_vars() 28983 1726882976.35461: variable 'ansible_search_path' from source: unknown 28983 1726882976.35463: variable 'ansible_search_path' from source: unknown 28983 1726882976.35895: variable 'omit' from source: magic vars 28983 1726882976.35950: variable 'omit' from source: magic vars 28983 1726882976.35969: variable 'omit' from source: magic vars 28983 1726882976.35973: we have included files to process 28983 1726882976.35974: generating all_blocks data 28983 1726882976.35976: done generating all_blocks data 28983 1726882976.35977: processing included file: fedora.linux_system_roles.network 28983 1726882976.36006: in VariableManager get_vars() 28983 1726882976.36019: done with get_vars() 28983 1726882976.36412: in VariableManager get_vars() 28983 1726882976.36432: done with get_vars() 28983 1726882976.36491: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726882976.37040: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726882976.37229: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726882976.38261: in VariableManager get_vars() 28983 1726882976.38288: done with get_vars() 28983 1726882976.38855: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726882976.41326: iterating over new_blocks loaded from include file 28983 1726882976.41328: in VariableManager get_vars() 28983 1726882976.41350: done with get_vars() 28983 1726882976.41351: filtering new block on tags 28983 1726882976.41859: done filtering new block on tags 28983 1726882976.41864: in VariableManager get_vars() 28983 1726882976.41886: done with 
get_vars() 28983 1726882976.41888: filtering new block on tags 28983 1726882976.41908: done filtering new block on tags 28983 1726882976.41911: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726882976.41917: extending task lists for all hosts with included blocks 28983 1726882976.42358: done extending task lists 28983 1726882976.42359: done processing included files 28983 1726882976.42361: results queue empty 28983 1726882976.42361: checking for any_errors_fatal 28983 1726882976.42366: done checking for any_errors_fatal 28983 1726882976.42367: checking for max_fail_percentage 28983 1726882976.42368: done checking for max_fail_percentage 28983 1726882976.42369: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.42370: done checking to see if all hosts have failed 28983 1726882976.42371: getting the remaining hosts for this loop 28983 1726882976.42372: done getting the remaining hosts for this loop 28983 1726882976.42375: getting the next task for host managed_node2 28983 1726882976.42384: done getting next task for host managed_node2 28983 1726882976.42390: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726882976.42396: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882976.42598: getting variables 28983 1726882976.42600: in VariableManager get_vars() 28983 1726882976.42614: Calling all_inventory to load vars for managed_node2 28983 1726882976.42620: Calling groups_inventory to load vars for managed_node2 28983 1726882976.42623: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.42629: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.42631: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.42635: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.43029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.43381: done with get_vars() 28983 1726882976.43398: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:42:56 -0400 (0:00:00.110) 0:00:06.432 ****** 28983 1726882976.43500: entering _queue_task() for managed_node2/include_tasks 28983 1726882976.44115: worker is 1 (out of 1 available) 28983 1726882976.44128: exiting _queue_task() for managed_node2/include_tasks 28983 1726882976.44146: done queuing things up, now waiting for results queue to drain 28983 1726882976.44148: 
waiting for pending results... 28983 1726882976.44465: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726882976.44715: in run() - task 0affe814-3a2d-b16d-c0a7-00000000020c 28983 1726882976.44720: variable 'ansible_search_path' from source: unknown 28983 1726882976.44723: variable 'ansible_search_path' from source: unknown 28983 1726882976.44739: calling self._execute() 28983 1726882976.44848: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.44861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.44877: variable 'omit' from source: magic vars 28983 1726882976.45443: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.45587: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.45591: _execute() done 28983 1726882976.45594: dumping result to json 28983 1726882976.45596: done dumping result, returning 28983 1726882976.45599: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-00000000020c] 28983 1726882976.45601: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020c 28983 1726882976.45736: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020c 28983 1726882976.45789: no more pending results, returning what we have 28983 1726882976.45796: in VariableManager get_vars() 28983 1726882976.45845: Calling all_inventory to load vars for managed_node2 28983 1726882976.45848: Calling groups_inventory to load vars for managed_node2 28983 1726882976.45851: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.45864: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.45868: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.45872: Calling groups_plugins_play to load 
vars for managed_node2 28983 1726882976.46330: WORKER PROCESS EXITING 28983 1726882976.46359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.46837: done with get_vars() 28983 1726882976.46848: variable 'ansible_search_path' from source: unknown 28983 1726882976.46849: variable 'ansible_search_path' from source: unknown 28983 1726882976.46906: we have included files to process 28983 1726882976.46907: generating all_blocks data 28983 1726882976.46909: done generating all_blocks data 28983 1726882976.46914: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726882976.46915: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726882976.46918: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726882976.47992: done processing included file 28983 1726882976.47994: iterating over new_blocks loaded from include file 28983 1726882976.47996: in VariableManager get_vars() 28983 1726882976.48025: done with get_vars() 28983 1726882976.48027: filtering new block on tags 28983 1726882976.48073: done filtering new block on tags 28983 1726882976.48077: in VariableManager get_vars() 28983 1726882976.48109: done with get_vars() 28983 1726882976.48111: filtering new block on tags 28983 1726882976.48182: done filtering new block on tags 28983 1726882976.48186: in VariableManager get_vars() 28983 1726882976.48213: done with get_vars() 28983 1726882976.48215: filtering new block on tags 28983 1726882976.48277: done filtering new block on tags 28983 1726882976.48286: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726882976.48292: 
extending task lists for all hosts with included blocks 28983 1726882976.50877: done extending task lists 28983 1726882976.50881: done processing included files 28983 1726882976.50882: results queue empty 28983 1726882976.50883: checking for any_errors_fatal 28983 1726882976.50887: done checking for any_errors_fatal 28983 1726882976.50888: checking for max_fail_percentage 28983 1726882976.50890: done checking for max_fail_percentage 28983 1726882976.50891: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.50892: done checking to see if all hosts have failed 28983 1726882976.50893: getting the remaining hosts for this loop 28983 1726882976.50895: done getting the remaining hosts for this loop 28983 1726882976.50902: getting the next task for host managed_node2 28983 1726882976.50909: done getting next task for host managed_node2 28983 1726882976.50912: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726882976.50918: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882976.50931: getting variables 28983 1726882976.50932: in VariableManager get_vars() 28983 1726882976.50950: Calling all_inventory to load vars for managed_node2 28983 1726882976.50953: Calling groups_inventory to load vars for managed_node2 28983 1726882976.50956: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.50962: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.50965: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.50969: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.51205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.51566: done with get_vars() 28983 1726882976.51580: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:42:56 -0400 (0:00:00.081) 0:00:06.514 ****** 28983 1726882976.51677: entering _queue_task() for managed_node2/setup 28983 1726882976.52028: worker is 1 (out of 1 available) 28983 1726882976.52046: exiting _queue_task() for managed_node2/setup 28983 1726882976.52059: done queuing things up, now waiting for results queue to drain 28983 1726882976.52061: waiting for pending results... 
28983 1726882976.52367: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726882976.52571: in run() - task 0affe814-3a2d-b16d-c0a7-000000000269 28983 1726882976.52598: variable 'ansible_search_path' from source: unknown 28983 1726882976.52607: variable 'ansible_search_path' from source: unknown 28983 1726882976.52661: calling self._execute() 28983 1726882976.52763: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.52777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.52796: variable 'omit' from source: magic vars 28983 1726882976.53556: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.53576: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.54073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882976.56918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882976.57012: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882976.57074: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882976.57140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882976.57176: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882976.57286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882976.57341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882976.57385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882976.57456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882976.57481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882976.57562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882976.57599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882976.57653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882976.57708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882976.57761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882976.57950: variable '__network_required_facts' from source: role 
'' defaults 28983 1726882976.57964: variable 'ansible_facts' from source: unknown 28983 1726882976.58100: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726882976.58109: when evaluation is False, skipping this task 28983 1726882976.58138: _execute() done 28983 1726882976.58142: dumping result to json 28983 1726882976.58145: done dumping result, returning 28983 1726882976.58148: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000000269] 28983 1726882976.58156: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000269 28983 1726882976.58381: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000269 28983 1726882976.58384: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882976.58438: no more pending results, returning what we have 28983 1726882976.58443: results queue empty 28983 1726882976.58444: checking for any_errors_fatal 28983 1726882976.58446: done checking for any_errors_fatal 28983 1726882976.58447: checking for max_fail_percentage 28983 1726882976.58449: done checking for max_fail_percentage 28983 1726882976.58451: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.58451: done checking to see if all hosts have failed 28983 1726882976.58452: getting the remaining hosts for this loop 28983 1726882976.58455: done getting the remaining hosts for this loop 28983 1726882976.58460: getting the next task for host managed_node2 28983 1726882976.58473: done getting next task for host managed_node2 28983 1726882976.58481: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726882976.58489: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882976.58507: getting variables 28983 1726882976.58509: in VariableManager get_vars() 28983 1726882976.58555: Calling all_inventory to load vars for managed_node2 28983 1726882976.58559: Calling groups_inventory to load vars for managed_node2 28983 1726882976.58563: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.58574: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.58581: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.58595: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.59519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.59881: done with get_vars() 28983 1726882976.59893: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:42:56 -0400 (0:00:00.083) 0:00:06.597 ****** 28983 1726882976.60010: entering _queue_task() for managed_node2/stat 28983 1726882976.60407: worker is 1 (out of 1 available) 28983 1726882976.60420: exiting _queue_task() for managed_node2/stat 28983 1726882976.60432: done queuing things up, now waiting for results queue to drain 28983 1726882976.60437: waiting for pending results... 
28983 1726882976.60729: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726882976.60939: in run() - task 0affe814-3a2d-b16d-c0a7-00000000026b 28983 1726882976.60945: variable 'ansible_search_path' from source: unknown 28983 1726882976.60947: variable 'ansible_search_path' from source: unknown 28983 1726882976.60951: calling self._execute() 28983 1726882976.61031: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.61054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.61071: variable 'omit' from source: magic vars 28983 1726882976.61554: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.61575: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.61798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882976.62137: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882976.62200: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882976.62252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882976.62355: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882976.62437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882976.62484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882976.62573: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882976.62577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882976.62670: variable '__network_is_ostree' from source: set_fact 28983 1726882976.62696: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726882976.62705: when evaluation is False, skipping this task 28983 1726882976.62713: _execute() done 28983 1726882976.62722: dumping result to json 28983 1726882976.62731: done dumping result, returning 28983 1726882976.62788: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-00000000026b] 28983 1726882976.62794: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026b skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726882976.62960: no more pending results, returning what we have 28983 1726882976.62964: results queue empty 28983 1726882976.62965: checking for any_errors_fatal 28983 1726882976.62972: done checking for any_errors_fatal 28983 1726882976.62973: checking for max_fail_percentage 28983 1726882976.62975: done checking for max_fail_percentage 28983 1726882976.62977: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.62978: done checking to see if all hosts have failed 28983 1726882976.62981: getting the remaining hosts for this loop 28983 1726882976.62983: done getting the remaining hosts for this loop 28983 1726882976.62988: getting the next task for host managed_node2 28983 1726882976.62999: done getting next task for host managed_node2 28983 
1726882976.63003: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726882976.63010: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882976.63025: getting variables 28983 1726882976.63027: in VariableManager get_vars() 28983 1726882976.63065: Calling all_inventory to load vars for managed_node2 28983 1726882976.63068: Calling groups_inventory to load vars for managed_node2 28983 1726882976.63071: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.63083: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.63087: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.63091: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.63530: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026b 28983 1726882976.63537: WORKER PROCESS EXITING 28983 1726882976.63563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.63960: done with get_vars() 28983 1726882976.63972: done getting variables 28983 1726882976.64043: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:42:56 -0400 (0:00:00.040) 0:00:06.638 ****** 28983 1726882976.64085: entering _queue_task() for managed_node2/set_fact 28983 1726882976.64448: worker is 1 (out of 1 available) 28983 1726882976.64459: exiting _queue_task() for managed_node2/set_fact 28983 1726882976.64470: done queuing things up, now waiting for results queue to drain 28983 1726882976.64472: waiting for pending results... 
28983 1726882976.64645: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726882976.64837: in run() - task 0affe814-3a2d-b16d-c0a7-00000000026c 28983 1726882976.64858: variable 'ansible_search_path' from source: unknown 28983 1726882976.64868: variable 'ansible_search_path' from source: unknown 28983 1726882976.64922: calling self._execute() 28983 1726882976.65017: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.65033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.65051: variable 'omit' from source: magic vars 28983 1726882976.65486: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.65505: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.65723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882976.66119: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882976.66214: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882976.66306: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882976.66369: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882976.66544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882976.66588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882976.66672: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882976.66714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882976.66899: variable '__network_is_ostree' from source: set_fact 28983 1726882976.66903: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726882976.66906: when evaluation is False, skipping this task 28983 1726882976.66908: _execute() done 28983 1726882976.66910: dumping result to json 28983 1726882976.66913: done dumping result, returning 28983 1726882976.66916: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-00000000026c] 28983 1726882976.66924: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026c skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726882976.67199: no more pending results, returning what we have 28983 1726882976.67204: results queue empty 28983 1726882976.67205: checking for any_errors_fatal 28983 1726882976.67214: done checking for any_errors_fatal 28983 1726882976.67215: checking for max_fail_percentage 28983 1726882976.67223: done checking for max_fail_percentage 28983 1726882976.67224: checking to see if all hosts have failed and the running result is not ok 28983 1726882976.67225: done checking to see if all hosts have failed 28983 1726882976.67226: getting the remaining hosts for this loop 28983 1726882976.67228: done getting the remaining hosts for this loop 28983 1726882976.67235: getting the next task for host managed_node2 28983 1726882976.67247: done getting next task for host managed_node2 28983 
1726882976.67251: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726882976.67259: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882976.67276: getting variables 28983 1726882976.67279: in VariableManager get_vars() 28983 1726882976.67318: Calling all_inventory to load vars for managed_node2 28983 1726882976.67322: Calling groups_inventory to load vars for managed_node2 28983 1726882976.67325: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882976.67445: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026c 28983 1726882976.67448: WORKER PROCESS EXITING 28983 1726882976.67457: Calling all_plugins_play to load vars for managed_node2 28983 1726882976.67460: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882976.67464: Calling groups_plugins_play to load vars for managed_node2 28983 1726882976.67745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882976.68114: done with get_vars() 28983 1726882976.68127: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:42:56 -0400 (0:00:00.041) 0:00:06.680 ****** 28983 1726882976.68242: entering _queue_task() for managed_node2/service_facts 28983 1726882976.68244: Creating lock for service_facts 28983 1726882976.68503: worker is 1 (out of 1 available) 28983 1726882976.68515: exiting _queue_task() for managed_node2/service_facts 28983 1726882976.68643: done queuing things up, now waiting for results queue to drain 28983 1726882976.68647: waiting for pending results... 
28983 1726882976.68822: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726882976.69022: in run() - task 0affe814-3a2d-b16d-c0a7-00000000026e 28983 1726882976.69048: variable 'ansible_search_path' from source: unknown 28983 1726882976.69056: variable 'ansible_search_path' from source: unknown 28983 1726882976.69108: calling self._execute() 28983 1726882976.69210: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.69223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.69241: variable 'omit' from source: magic vars 28983 1726882976.69676: variable 'ansible_distribution_major_version' from source: facts 28983 1726882976.69699: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882976.69710: variable 'omit' from source: magic vars 28983 1726882976.69843: variable 'omit' from source: magic vars 28983 1726882976.69873: variable 'omit' from source: magic vars 28983 1726882976.69929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882976.69995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882976.70024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882976.70054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882976.70081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882976.70120: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882976.70129: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.70140: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726882976.70268: Set connection var ansible_connection to ssh 28983 1726882976.70390: Set connection var ansible_shell_executable to /bin/sh 28983 1726882976.70393: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882976.70397: Set connection var ansible_timeout to 10 28983 1726882976.70399: Set connection var ansible_pipelining to False 28983 1726882976.70402: Set connection var ansible_shell_type to sh 28983 1726882976.70404: variable 'ansible_shell_executable' from source: unknown 28983 1726882976.70407: variable 'ansible_connection' from source: unknown 28983 1726882976.70410: variable 'ansible_module_compression' from source: unknown 28983 1726882976.70412: variable 'ansible_shell_type' from source: unknown 28983 1726882976.70414: variable 'ansible_shell_executable' from source: unknown 28983 1726882976.70416: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882976.70418: variable 'ansible_pipelining' from source: unknown 28983 1726882976.70420: variable 'ansible_timeout' from source: unknown 28983 1726882976.70424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882976.70655: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882976.70673: variable 'omit' from source: magic vars 28983 1726882976.70686: starting attempt loop 28983 1726882976.70694: running the handler 28983 1726882976.70718: _low_level_execute_command(): starting 28983 1726882976.70732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882976.71608: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882976.71648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882976.71669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882976.71704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882976.71826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882976.73602: stdout chunk (state=3): >>>/root <<< 28983 1726882976.73799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882976.73802: stdout chunk (state=3): >>><<< 28983 1726882976.73805: stderr chunk (state=3): >>><<< 28983 1726882976.73830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882976.73940: _low_level_execute_command(): starting 28983 1726882976.73944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455 `" && echo ansible-tmp-1726882976.7384005-29272-3092308249455="` echo /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455 `" ) && sleep 0' 28983 1726882976.74543: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882976.74597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882976.74669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882976.76708: stdout chunk (state=3): >>>ansible-tmp-1726882976.7384005-29272-3092308249455=/root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455 <<< 28983 1726882976.76914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882976.76919: stdout chunk (state=3): >>><<< 28983 1726882976.76922: stderr chunk (state=3): >>><<< 28983 1726882976.77141: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882976.7384005-29272-3092308249455=/root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882976.77145: variable 'ansible_module_compression' from source: unknown 28983 1726882976.77148: ANSIBALLZ: Using lock for service_facts 28983 1726882976.77150: ANSIBALLZ: Acquiring lock 28983 1726882976.77151: ANSIBALLZ: Lock acquired: 140284030452608 28983 1726882976.77153: ANSIBALLZ: Creating module 28983 1726882976.92999: ANSIBALLZ: Writing module into payload 28983 1726882976.93083: ANSIBALLZ: Writing module 28983 1726882976.93102: ANSIBALLZ: Renaming module 28983 1726882976.93108: ANSIBALLZ: Done creating module 28983 1726882976.93122: variable 'ansible_facts' from source: unknown 28983 1726882976.93176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py 28983 1726882976.93297: Sending initial data 28983 1726882976.93301: Sent initial data (160 bytes) 28983 1726882976.93733: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882976.93770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882976.93774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726882976.93777: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882976.93779: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882976.93830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882976.93837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882976.93916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882976.95668: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726882976.95673: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882976.95735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882976.95807: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2hjge9aw /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py <<< 28983 1726882976.95813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py" <<< 28983 1726882976.95875: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2hjge9aw" to remote "/root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py" <<< 28983 1726882976.96809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882976.96873: stderr chunk (state=3): >>><<< 28983 1726882976.96877: stdout chunk (state=3): >>><<< 28983 1726882976.96899: done transferring module to remote 28983 1726882976.96908: _low_level_execute_command(): starting 28983 1726882976.96913: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/ /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py && sleep 0' 28983 1726882976.97542: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882976.97546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882976.97549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882976.97551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882976.97601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882976.97664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882976.99554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882976.99601: stderr chunk (state=3): >>><<< 28983 1726882976.99605: stdout chunk (state=3): >>><<< 28983 1726882976.99623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882976.99626: _low_level_execute_command(): starting 28983 1726882976.99631: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/AnsiballZ_service_facts.py && sleep 0' 28983 1726882977.00084: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882977.00088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882977.00091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882977.00093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882977.00139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882977.00154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882977.00227: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28983 1726882978.99155: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "st<<< 28983 1726882978.99172: stdout chunk (state=3): >>>atic", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726882979.01059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882979.01069: stderr chunk (state=3): >>><<< 28983 1726882979.01075: stdout chunk (state=3): >>><<< 28983 1726882979.01163: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": 
"bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": 
"plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882979.03354: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882979.03365: _low_level_execute_command(): starting 28983 1726882979.03372: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882976.7384005-29272-3092308249455/ > /dev/null 2>&1 && sleep 0' 28983 1726882979.04940: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882979.04944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882979.04947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882979.04949: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882979.04951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882979.04954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882979.05150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882979.05157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882979.05239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882979.07315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882979.07319: stderr chunk (state=3): >>><<< 28983 1726882979.07321: stdout chunk (state=3): >>><<< 28983 1726882979.07383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882979.07387: handler run complete 28983 1726882979.08000: variable 'ansible_facts' from source: unknown 28983 1726882979.08599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882979.10253: variable 'ansible_facts' from source: unknown 28983 1726882979.10598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882979.11285: attempt loop complete, returning result 28983 1726882979.11289: _execute() done 28983 1726882979.11296: dumping result to json 28983 1726882979.11584: done dumping result, returning 28983 1726882979.11592: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-00000000026e] 28983 1726882979.11598: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026e 28983 1726882979.13694: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026e 28983 1726882979.13697: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882979.13803: no more pending results, returning what we have 28983 1726882979.13806: results queue empty 28983 1726882979.13807: checking for any_errors_fatal 28983 1726882979.13812: done checking for any_errors_fatal 28983 1726882979.13813: checking for max_fail_percentage 28983 1726882979.13815: done checking for 
max_fail_percentage 28983 1726882979.13816: checking to see if all hosts have failed and the running result is not ok 28983 1726882979.13817: done checking to see if all hosts have failed 28983 1726882979.13818: getting the remaining hosts for this loop 28983 1726882979.13820: done getting the remaining hosts for this loop 28983 1726882979.13824: getting the next task for host managed_node2 28983 1726882979.13831: done getting next task for host managed_node2 28983 1726882979.13837: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726882979.13845: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726882979.13855: getting variables 28983 1726882979.13857: in VariableManager get_vars() 28983 1726882979.13887: Calling all_inventory to load vars for managed_node2 28983 1726882979.13890: Calling groups_inventory to load vars for managed_node2 28983 1726882979.13893: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882979.13902: Calling all_plugins_play to load vars for managed_node2 28983 1726882979.13905: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882979.13909: Calling groups_plugins_play to load vars for managed_node2 28983 1726882979.14965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882979.16577: done with get_vars() 28983 1726882979.16595: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:42:59 -0400 (0:00:02.486) 0:00:09.166 ****** 28983 1726882979.16918: entering _queue_task() for managed_node2/package_facts 28983 1726882979.16920: Creating lock for package_facts 28983 1726882979.17623: worker is 1 (out of 1 available) 28983 1726882979.17839: exiting _queue_task() for managed_node2/package_facts 28983 1726882979.17851: done queuing things up, now waiting for results queue to drain 28983 1726882979.17853: waiting for pending results... 
28983 1726882979.18284: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726882979.18641: in run() - task 0affe814-3a2d-b16d-c0a7-00000000026f 28983 1726882979.18645: variable 'ansible_search_path' from source: unknown 28983 1726882979.18648: variable 'ansible_search_path' from source: unknown 28983 1726882979.18651: calling self._execute() 28983 1726882979.18821: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882979.18829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882979.18885: variable 'omit' from source: magic vars 28983 1726882979.19729: variable 'ansible_distribution_major_version' from source: facts 28983 1726882979.19848: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882979.19861: variable 'omit' from source: magic vars 28983 1726882979.20099: variable 'omit' from source: magic vars 28983 1726882979.20141: variable 'omit' from source: magic vars 28983 1726882979.20440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882979.20444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882979.20447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882979.20449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882979.20460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882979.20500: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882979.20730: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882979.20733: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726882979.20865: Set connection var ansible_connection to ssh 28983 1726882979.20884: Set connection var ansible_shell_executable to /bin/sh 28983 1726882979.20903: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882979.20919: Set connection var ansible_timeout to 10 28983 1726882979.20932: Set connection var ansible_pipelining to False 28983 1726882979.20943: Set connection var ansible_shell_type to sh 28983 1726882979.21139: variable 'ansible_shell_executable' from source: unknown 28983 1726882979.21143: variable 'ansible_connection' from source: unknown 28983 1726882979.21146: variable 'ansible_module_compression' from source: unknown 28983 1726882979.21148: variable 'ansible_shell_type' from source: unknown 28983 1726882979.21150: variable 'ansible_shell_executable' from source: unknown 28983 1726882979.21152: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882979.21154: variable 'ansible_pipelining' from source: unknown 28983 1726882979.21156: variable 'ansible_timeout' from source: unknown 28983 1726882979.21159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882979.21555: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882979.21574: variable 'omit' from source: magic vars 28983 1726882979.21587: starting attempt loop 28983 1726882979.21610: running the handler 28983 1726882979.21822: _low_level_execute_command(): starting 28983 1726882979.21825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882979.23225: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726882979.23229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882979.23232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882979.23237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882979.23363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882979.23404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882979.25173: stdout chunk (state=3): >>>/root <<< 28983 1726882979.25283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882979.25360: stderr chunk (state=3): >>><<< 28983 1726882979.25640: stdout chunk (state=3): >>><<< 28983 1726882979.25646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882979.25649: _low_level_execute_command(): starting 28983 1726882979.25653: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804 `" && echo ansible-tmp-1726882979.2544203-29355-226507605235804="` echo /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804 `" ) && sleep 0' 28983 1726882979.26812: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882979.26855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882979.27050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882979.27070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882979.27181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882979.29261: stdout chunk (state=3): >>>ansible-tmp-1726882979.2544203-29355-226507605235804=/root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804 <<< 28983 1726882979.29377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882979.29437: stderr chunk (state=3): >>><<< 28983 1726882979.29448: stdout chunk (state=3): >>><<< 28983 1726882979.29473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882979.2544203-29355-226507605235804=/root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882979.29585: variable 'ansible_module_compression' from source: unknown 28983 1726882979.29696: ANSIBALLZ: Using lock for package_facts 28983 1726882979.29787: ANSIBALLZ: Acquiring lock 28983 1726882979.29790: ANSIBALLZ: Lock acquired: 140284029685056 28983 1726882979.29792: ANSIBALLZ: Creating module 28983 1726882980.07850: ANSIBALLZ: Writing module into payload 28983 1726882980.08126: ANSIBALLZ: Writing module 28983 1726882980.08188: ANSIBALLZ: Renaming module 28983 1726882980.08239: ANSIBALLZ: Done creating module 28983 1726882980.08243: variable 'ansible_facts' from source: unknown 28983 1726882980.08794: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py 28983 1726882980.08920: Sending initial data 28983 1726882980.08930: Sent initial data (162 bytes) 28983 1726882980.10439: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882980.10546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882980.10657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882980.10762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882980.10866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882980.12625: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726882980.12643: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882980.12694: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882980.12843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpt6b3emqd /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py <<< 28983 1726882980.12848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py" <<< 28983 1726882980.12856: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpt6b3emqd" to remote "/root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py" <<< 28983 1726882980.18850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882980.19050: stderr chunk (state=3): >>><<< 28983 1726882980.19053: stdout chunk (state=3): >>><<< 28983 1726882980.19059: done transferring module to remote 28983 1726882980.19077: _low_level_execute_command(): starting 28983 1726882980.19128: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/ /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py && sleep 0' 28983 1726882980.20338: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882980.20444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882980.20485: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882980.20489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882980.20564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882980.20567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882980.20703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882980.22639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882980.22713: stderr chunk (state=3): >>><<< 28983 1726882980.22928: stdout chunk (state=3): >>><<< 28983 1726882980.22932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882980.22937: _low_level_execute_command(): starting 28983 1726882980.22939: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/AnsiballZ_package_facts.py && sleep 0' 28983 1726882980.24254: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882980.24452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882980.24477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882980.24480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882980.88113: stdout chunk (state=3): 
>>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": 
[{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 28983 1726882980.88118: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, 
"arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": 
"8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": 
"0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": 
"2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": 
"openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": 
"libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": 
"1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", 
"release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": 
[{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 28983 1726882980.88396: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": 
"16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": 
"chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726882980.90215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882980.90273: stderr chunk (state=3): >>><<< 28983 1726882980.90277: stdout chunk (state=3): >>><<< 28983 1726882980.90317: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": 
"google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": 
"5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": 
"libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": 
"groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": 
"1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": 
[{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": 
"python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": 
"1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": 
"elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", 
"release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": 
"3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", 
"version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882980.94755: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882980.94760: _low_level_execute_command(): starting 28983 1726882980.94762: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882979.2544203-29355-226507605235804/ > /dev/null 2>&1 && sleep 0' 28983 1726882980.95154: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882980.95224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882980.95271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882980.95286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882980.95303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882980.95410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882980.97469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882980.97494: stdout chunk (state=3): >>><<< 28983 1726882980.97527: stderr chunk (state=3): >>><<< 28983 1726882980.97548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882980.97739: handler run complete 28983 1726882981.00685: variable 'ansible_facts' from source: unknown 28983 1726882981.02619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.07521: variable 'ansible_facts' from source: unknown 28983 1726882981.08299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.11973: attempt loop complete, returning result 28983 1726882981.12050: _execute() done 28983 1726882981.12112: dumping result to json 28983 1726882981.12521: done dumping result, returning 28983 1726882981.12576: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-00000000026f] 28983 1726882981.12588: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026f 28983 1726882981.23207: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000026f 28983 1726882981.23211: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882981.23286: no more pending results, returning what we have 28983 1726882981.23292: results queue empty 28983 1726882981.23293: checking for any_errors_fatal 28983 1726882981.23297: done checking for any_errors_fatal 28983 1726882981.23298: checking for max_fail_percentage 28983 1726882981.23300: done checking for max_fail_percentage 28983 1726882981.23301: checking to see if all hosts have failed and the running result is not ok 28983 1726882981.23302: done checking to see if all hosts have failed 28983 1726882981.23303: getting the remaining hosts for this loop 28983 1726882981.23306: done 
getting the remaining hosts for this loop 28983 1726882981.23311: getting the next task for host managed_node2 28983 1726882981.23319: done getting next task for host managed_node2 28983 1726882981.23326: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726882981.23335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882981.23552: getting variables 28983 1726882981.23555: in VariableManager get_vars() 28983 1726882981.23586: Calling all_inventory to load vars for managed_node2 28983 1726882981.23589: Calling groups_inventory to load vars for managed_node2 28983 1726882981.23592: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882981.23605: Calling all_plugins_play to load vars for managed_node2 28983 1726882981.23609: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882981.23613: Calling groups_plugins_play to load vars for managed_node2 28983 1726882981.29325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.36306: done with get_vars() 28983 1726882981.36418: done getting variables 28983 1726882981.36688: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:43:01 -0400 (0:00:02.198) 0:00:11.365 ****** 28983 1726882981.36742: entering _queue_task() for managed_node2/debug 28983 1726882981.37959: worker is 1 (out of 1 available) 28983 1726882981.37972: exiting _queue_task() for managed_node2/debug 28983 1726882981.37984: done queuing things up, now waiting for results queue to drain 28983 1726882981.37986: waiting for pending results... 
28983 1726882981.38461: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726882981.38756: in run() - task 0affe814-3a2d-b16d-c0a7-00000000020d 28983 1726882981.38818: variable 'ansible_search_path' from source: unknown 28983 1726882981.39015: variable 'ansible_search_path' from source: unknown 28983 1726882981.39019: calling self._execute() 28983 1726882981.39165: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882981.39181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.39198: variable 'omit' from source: magic vars 28983 1726882981.40380: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.40488: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882981.40492: variable 'omit' from source: magic vars 28983 1726882981.40557: variable 'omit' from source: magic vars 28983 1726882981.40952: variable 'network_provider' from source: set_fact 28983 1726882981.41070: variable 'omit' from source: magic vars 28983 1726882981.41318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882981.41394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882981.41470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882981.41612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882981.41809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882981.41816: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882981.41819: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726882981.41822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.42008: Set connection var ansible_connection to ssh 28983 1726882981.42151: Set connection var ansible_shell_executable to /bin/sh 28983 1726882981.42167: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882981.42183: Set connection var ansible_timeout to 10 28983 1726882981.42194: Set connection var ansible_pipelining to False 28983 1726882981.42333: Set connection var ansible_shell_type to sh 28983 1726882981.42338: variable 'ansible_shell_executable' from source: unknown 28983 1726882981.42341: variable 'ansible_connection' from source: unknown 28983 1726882981.42343: variable 'ansible_module_compression' from source: unknown 28983 1726882981.42347: variable 'ansible_shell_type' from source: unknown 28983 1726882981.42349: variable 'ansible_shell_executable' from source: unknown 28983 1726882981.42351: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882981.42353: variable 'ansible_pipelining' from source: unknown 28983 1726882981.42356: variable 'ansible_timeout' from source: unknown 28983 1726882981.42358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.43020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882981.43024: variable 'omit' from source: magic vars 28983 1726882981.43026: starting attempt loop 28983 1726882981.43028: running the handler 28983 1726882981.43031: handler run complete 28983 1726882981.43128: attempt loop complete, returning result 28983 1726882981.43132: _execute() done 28983 1726882981.43137: dumping result to json 28983 1726882981.43139: done dumping result, returning 
28983 1726882981.43142: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-00000000020d] 28983 1726882981.43144: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020d 28983 1726882981.43221: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020d 28983 1726882981.43225: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726882981.43302: no more pending results, returning what we have 28983 1726882981.43306: results queue empty 28983 1726882981.43307: checking for any_errors_fatal 28983 1726882981.43320: done checking for any_errors_fatal 28983 1726882981.43321: checking for max_fail_percentage 28983 1726882981.43322: done checking for max_fail_percentage 28983 1726882981.43324: checking to see if all hosts have failed and the running result is not ok 28983 1726882981.43325: done checking to see if all hosts have failed 28983 1726882981.43326: getting the remaining hosts for this loop 28983 1726882981.43328: done getting the remaining hosts for this loop 28983 1726882981.43335: getting the next task for host managed_node2 28983 1726882981.43344: done getting next task for host managed_node2 28983 1726882981.43349: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726882981.43356: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882981.43369: getting variables 28983 1726882981.43371: in VariableManager get_vars() 28983 1726882981.43413: Calling all_inventory to load vars for managed_node2 28983 1726882981.43417: Calling groups_inventory to load vars for managed_node2 28983 1726882981.43420: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882981.43430: Calling all_plugins_play to load vars for managed_node2 28983 1726882981.43927: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882981.43938: Calling groups_plugins_play to load vars for managed_node2 28983 1726882981.48618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.56320: done with get_vars() 28983 1726882981.56492: done getting variables 28983 1726882981.56643: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:43:01 -0400 (0:00:00.200) 0:00:11.565 ****** 28983 1726882981.56807: entering _queue_task() for managed_node2/fail 28983 1726882981.56809: Creating lock for fail 28983 1726882981.57465: worker is 1 (out of 1 available) 28983 1726882981.57479: exiting _queue_task() for managed_node2/fail 28983 1726882981.57494: done queuing things up, now waiting for results queue to drain 28983 1726882981.57496: waiting for pending results... 28983 1726882981.58188: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726882981.58714: in run() - task 0affe814-3a2d-b16d-c0a7-00000000020e 28983 1726882981.58847: variable 'ansible_search_path' from source: unknown 28983 1726882981.58850: variable 'ansible_search_path' from source: unknown 28983 1726882981.58899: calling self._execute() 28983 1726882981.59440: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882981.59444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.59447: variable 'omit' from source: magic vars 28983 1726882981.60344: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.60348: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882981.60352: variable 'network_state' from source: role '' defaults 28983 1726882981.60357: Evaluated conditional (network_state != {}): False 28983 1726882981.60360: when evaluation is False, skipping this task 28983 1726882981.60366: _execute() done 28983 1726882981.60369: dumping result to json 28983 1726882981.60382: done dumping result, returning 28983 1726882981.60390: done running TaskExecutor() 
for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-00000000020e] 28983 1726882981.60403: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020e 28983 1726882981.60518: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020e 28983 1726882981.60521: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882981.60584: no more pending results, returning what we have 28983 1726882981.60588: results queue empty 28983 1726882981.60589: checking for any_errors_fatal 28983 1726882981.60596: done checking for any_errors_fatal 28983 1726882981.60597: checking for max_fail_percentage 28983 1726882981.60599: done checking for max_fail_percentage 28983 1726882981.60600: checking to see if all hosts have failed and the running result is not ok 28983 1726882981.60601: done checking to see if all hosts have failed 28983 1726882981.60602: getting the remaining hosts for this loop 28983 1726882981.60604: done getting the remaining hosts for this loop 28983 1726882981.60608: getting the next task for host managed_node2 28983 1726882981.60617: done getting next task for host managed_node2 28983 1726882981.60621: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726882981.60627: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882981.60646: getting variables 28983 1726882981.60648: in VariableManager get_vars() 28983 1726882981.60686: Calling all_inventory to load vars for managed_node2 28983 1726882981.60689: Calling groups_inventory to load vars for managed_node2 28983 1726882981.60691: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882981.60701: Calling all_plugins_play to load vars for managed_node2 28983 1726882981.60704: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882981.60708: Calling groups_plugins_play to load vars for managed_node2 28983 1726882981.65305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.68425: done with get_vars() 28983 1726882981.68475: done getting variables 28983 1726882981.68552: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:43:01 -0400 (0:00:00.117) 0:00:11.683 ****** 28983 1726882981.68597: entering _queue_task() for managed_node2/fail 28983 1726882981.69207: worker is 1 (out of 1 available) 28983 1726882981.69223: exiting _queue_task() for managed_node2/fail 28983 1726882981.69365: done queuing things up, now waiting for results queue to drain 28983 1726882981.69368: waiting for pending results... 28983 1726882981.69670: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726882981.70091: in run() - task 0affe814-3a2d-b16d-c0a7-00000000020f 28983 1726882981.70095: variable 'ansible_search_path' from source: unknown 28983 1726882981.70099: variable 'ansible_search_path' from source: unknown 28983 1726882981.70102: calling self._execute() 28983 1726882981.70417: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882981.70421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.70425: variable 'omit' from source: magic vars 28983 1726882981.71370: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.71431: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882981.71716: variable 'network_state' from source: role '' defaults 28983 1726882981.71745: Evaluated conditional (network_state != {}): False 28983 1726882981.71753: when evaluation is False, skipping this task 28983 1726882981.71761: _execute() done 28983 1726882981.71770: dumping result to json 28983 1726882981.71781: done dumping result, returning 28983 1726882981.71793: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-00000000020f] 28983 1726882981.71806: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020f 28983 1726882981.71948: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000020f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882981.72010: no more pending results, returning what we have 28983 1726882981.72015: results queue empty 28983 1726882981.72016: checking for any_errors_fatal 28983 1726882981.72023: done checking for any_errors_fatal 28983 1726882981.72025: checking for max_fail_percentage 28983 1726882981.72027: done checking for max_fail_percentage 28983 1726882981.72028: checking to see if all hosts have failed and the running result is not ok 28983 1726882981.72029: done checking to see if all hosts have failed 28983 1726882981.72030: getting the remaining hosts for this loop 28983 1726882981.72033: done getting the remaining hosts for this loop 28983 1726882981.72040: getting the next task for host managed_node2 28983 1726882981.72050: done getting next task for host managed_node2 28983 1726882981.72055: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726882981.72062: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882981.72087: getting variables 28983 1726882981.72089: in VariableManager get_vars() 28983 1726882981.72238: Calling all_inventory to load vars for managed_node2 28983 1726882981.72242: Calling groups_inventory to load vars for managed_node2 28983 1726882981.72246: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882981.72253: WORKER PROCESS EXITING 28983 1726882981.72265: Calling all_plugins_play to load vars for managed_node2 28983 1726882981.72269: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882981.72275: Calling groups_plugins_play to load vars for managed_node2 28983 1726882981.75110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.78543: done with get_vars() 28983 1726882981.78588: done getting variables 28983 1726882981.78661: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:43:01 -0400 (0:00:00.101) 0:00:11.784 ****** 28983 1726882981.78708: entering _queue_task() for managed_node2/fail 28983 1726882981.79071: worker is 1 (out of 1 available) 28983 1726882981.79090: exiting _queue_task() for managed_node2/fail 28983 1726882981.79104: done queuing things up, now waiting for results queue to drain 28983 1726882981.79106: waiting for pending results... 28983 1726882981.79344: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726882981.79516: in run() - task 0affe814-3a2d-b16d-c0a7-000000000210 28983 1726882981.79532: variable 'ansible_search_path' from source: unknown 28983 1726882981.79538: variable 'ansible_search_path' from source: unknown 28983 1726882981.79590: calling self._execute() 28983 1726882981.79696: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882981.79705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.79720: variable 'omit' from source: magic vars 28983 1726882981.80170: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.80183: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882981.80432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882981.83613: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882981.83700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882981.83744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882981.83807: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882981.83838: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882981.84140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882981.84143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882981.84146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882981.84148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882981.84350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882981.84483: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.84486: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726882981.84739: variable 'ansible_distribution' from source: facts 28983 1726882981.84886: variable '__network_rh_distros' from source: role '' defaults 28983 1726882981.84898: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726882981.84902: when evaluation is False, skipping this task 28983 1726882981.84905: _execute() done 28983 1726882981.84909: dumping result to json 28983 1726882981.84914: done dumping 
result, returning 28983 1726882981.84923: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000000210] 28983 1726882981.84929: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000210 28983 1726882981.85341: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000210 28983 1726882981.85345: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726882981.85405: no more pending results, returning what we have 28983 1726882981.85409: results queue empty 28983 1726882981.85411: checking for any_errors_fatal 28983 1726882981.85418: done checking for any_errors_fatal 28983 1726882981.85419: checking for max_fail_percentage 28983 1726882981.85421: done checking for max_fail_percentage 28983 1726882981.85422: checking to see if all hosts have failed and the running result is not ok 28983 1726882981.85423: done checking to see if all hosts have failed 28983 1726882981.85424: getting the remaining hosts for this loop 28983 1726882981.85426: done getting the remaining hosts for this loop 28983 1726882981.85432: getting the next task for host managed_node2 28983 1726882981.85442: done getting next task for host managed_node2 28983 1726882981.85447: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726882981.85452: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882981.85467: getting variables 28983 1726882981.85469: in VariableManager get_vars() 28983 1726882981.85508: Calling all_inventory to load vars for managed_node2 28983 1726882981.85511: Calling groups_inventory to load vars for managed_node2 28983 1726882981.85514: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882981.85523: Calling all_plugins_play to load vars for managed_node2 28983 1726882981.85526: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882981.85530: Calling groups_plugins_play to load vars for managed_node2 28983 1726882981.90108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882981.92715: done with get_vars() 28983 1726882981.92745: done getting variables 28983 1726882981.92837: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:43:01 -0400 (0:00:00.141) 0:00:11.926 ****** 28983 1726882981.92864: entering _queue_task() for managed_node2/dnf 28983 1726882981.93126: worker is 1 (out of 1 available) 28983 1726882981.93142: exiting _queue_task() for managed_node2/dnf 28983 1726882981.93154: done queuing things up, now waiting for results queue to drain 28983 1726882981.93156: waiting for pending results... 28983 1726882981.93347: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726882981.93444: in run() - task 0affe814-3a2d-b16d-c0a7-000000000211 28983 1726882981.93458: variable 'ansible_search_path' from source: unknown 28983 1726882981.93462: variable 'ansible_search_path' from source: unknown 28983 1726882981.93499: calling self._execute() 28983 1726882981.93575: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882981.93579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882981.93589: variable 'omit' from source: magic vars 28983 1726882981.93930: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.93955: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882981.94311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882981.97376: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882981.97380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882981.97537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882981.97542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882981.97545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882981.97790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882981.97798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882981.97829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882981.97903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882981.98121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882981.98164: variable 'ansible_distribution' from source: facts 28983 1726882981.98169: variable 'ansible_distribution_major_version' from source: facts 28983 1726882981.98180: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726882981.98440: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882981.98735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882981.98920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882981.98952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882981.99014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882981.99025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882981.99082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882981.99111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882981.99156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882981.99211: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882981.99237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882981.99286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882981.99324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882981.99356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882981.99407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882981.99424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882981.99631: variable 'network_connections' from source: include params 28983 1726882981.99644: variable 'interface' from source: play vars 28983 1726882981.99731: variable 'interface' from source: play vars 28983 1726882981.99818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882982.00019: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882982.00108: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882982.00111: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882982.00137: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882982.00190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882982.00216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882982.00248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.00284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882982.00350: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726882982.00711: variable 'network_connections' from source: include params 28983 1726882982.00714: variable 'interface' from source: play vars 28983 1726882982.00771: variable 'interface' from source: play vars 28983 1726882982.00819: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726882982.00823: when evaluation is False, skipping this task 28983 1726882982.00826: _execute() done 28983 1726882982.00828: dumping result to json 28983 1726882982.00830: done dumping result, returning 28983 1726882982.00869: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000211] 28983 1726882982.00872: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000211 28983 1726882982.01161: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000211 28983 1726882982.01164: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726882982.01213: no more pending results, returning what we have 28983 1726882982.01216: results queue empty 28983 1726882982.01218: checking for any_errors_fatal 28983 1726882982.01225: done checking for any_errors_fatal 28983 1726882982.01226: checking for max_fail_percentage 28983 1726882982.01228: done checking for max_fail_percentage 28983 1726882982.01229: checking to see if all hosts have failed and the running result is not ok 28983 1726882982.01230: done checking to see if all hosts have failed 28983 1726882982.01231: getting the remaining hosts for this loop 28983 1726882982.01233: done getting the remaining hosts for this loop 28983 1726882982.01243: getting the next task for host managed_node2 28983 1726882982.01251: done getting next task for host managed_node2 28983 1726882982.01256: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726882982.01261: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882982.01282: getting variables 28983 1726882982.01284: in VariableManager get_vars() 28983 1726882982.01320: Calling all_inventory to load vars for managed_node2 28983 1726882982.01323: Calling groups_inventory to load vars for managed_node2 28983 1726882982.01326: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882982.01337: Calling all_plugins_play to load vars for managed_node2 28983 1726882982.01340: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882982.01348: Calling groups_plugins_play to load vars for managed_node2 28983 1726882982.05917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882982.12401: done with get_vars() 28983 1726882982.12447: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726882982.12541: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:43:02 -0400 (0:00:00.197) 0:00:12.123 ****** 28983 1726882982.12587: entering _queue_task() for managed_node2/yum 28983 1726882982.12590: Creating lock for yum 28983 1726882982.12968: worker is 1 (out of 1 available) 28983 1726882982.12987: exiting _queue_task() for managed_node2/yum 28983 1726882982.13109: done queuing things up, now waiting for results queue to drain 28983 1726882982.13112: waiting for pending results... 28983 1726882982.13759: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726882982.13801: in run() - task 0affe814-3a2d-b16d-c0a7-000000000212 28983 1726882982.13816: variable 'ansible_search_path' from source: unknown 28983 1726882982.13825: variable 'ansible_search_path' from source: unknown 28983 1726882982.13867: calling self._execute() 28983 1726882982.14164: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882982.14173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882982.14239: variable 'omit' from source: magic vars 28983 1726882982.14789: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.14802: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882982.15028: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882982.17758: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882982.17864: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882982.17909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882982.17952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882982.17985: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882982.18258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.18263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.18265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.18268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.18270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.18368: variable 'ansible_distribution_major_version' from source: facts 28983 
1726882982.18371: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28983 1726882982.18374: when evaluation is False, skipping this task 28983 1726882982.18380: _execute() done 28983 1726882982.18383: dumping result to json 28983 1726882982.18388: done dumping result, returning 28983 1726882982.18404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000212] 28983 1726882982.18408: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000212 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726882982.18743: no more pending results, returning what we have 28983 1726882982.18747: results queue empty 28983 1726882982.18748: checking for any_errors_fatal 28983 1726882982.18753: done checking for any_errors_fatal 28983 1726882982.18754: checking for max_fail_percentage 28983 1726882982.18756: done checking for max_fail_percentage 28983 1726882982.18757: checking to see if all hosts have failed and the running result is not ok 28983 1726882982.18758: done checking to see if all hosts have failed 28983 1726882982.18759: getting the remaining hosts for this loop 28983 1726882982.18761: done getting the remaining hosts for this loop 28983 1726882982.18764: getting the next task for host managed_node2 28983 1726882982.18772: done getting next task for host managed_node2 28983 1726882982.18776: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726882982.18782: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882982.18798: getting variables 28983 1726882982.18799: in VariableManager get_vars() 28983 1726882982.18832: Calling all_inventory to load vars for managed_node2 28983 1726882982.18837: Calling groups_inventory to load vars for managed_node2 28983 1726882982.18840: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882982.18849: Calling all_plugins_play to load vars for managed_node2 28983 1726882982.18853: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882982.18857: Calling groups_plugins_play to load vars for managed_node2 28983 1726882982.19496: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000212 28983 1726882982.19500: WORKER PROCESS EXITING 28983 1726882982.25030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882982.33524: done with get_vars() 28983 1726882982.33577: done getting variables 28983 1726882982.33975: Loading ActionModule 'fail' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:43:02 -0400 (0:00:00.214) 0:00:12.337 ****** 28983 1726882982.34019: entering _queue_task() for managed_node2/fail 28983 1726882982.35277: worker is 1 (out of 1 available) 28983 1726882982.35293: exiting _queue_task() for managed_node2/fail 28983 1726882982.35305: done queuing things up, now waiting for results queue to drain 28983 1726882982.35308: waiting for pending results... 28983 1726882982.35522: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726882982.35726: in run() - task 0affe814-3a2d-b16d-c0a7-000000000213 28983 1726882982.35731: variable 'ansible_search_path' from source: unknown 28983 1726882982.35736: variable 'ansible_search_path' from source: unknown 28983 1726882982.35766: calling self._execute() 28983 1726882982.36080: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882982.36085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882982.36088: variable 'omit' from source: magic vars 28983 1726882982.36897: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.37145: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882982.37454: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882982.38204: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882982.45470: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882982.45749: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882982.45791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882982.53582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882982.53609: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882982.53700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.53732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.53775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.53828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.53846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.53913: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.53942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.53975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.54045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.54049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.54443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.54447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.54450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.54453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726882982.54455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.54457: variable 'network_connections' from source: include params 28983 1726882982.54465: variable 'interface' from source: play vars 28983 1726882982.54552: variable 'interface' from source: play vars 28983 1726882982.54636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882982.55049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882982.55094: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882982.55130: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882982.55290: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882982.55547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882982.55576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882982.55606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.55643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882982.56042: variable 
'__network_team_connections_defined' from source: role '' defaults 28983 1726882982.56677: variable 'network_connections' from source: include params 28983 1726882982.56681: variable 'interface' from source: play vars 28983 1726882982.56761: variable 'interface' from source: play vars 28983 1726882982.56781: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726882982.56785: when evaluation is False, skipping this task 28983 1726882982.56870: _execute() done 28983 1726882982.56872: dumping result to json 28983 1726882982.56875: done dumping result, returning 28983 1726882982.56877: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000213] 28983 1726882982.56879: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000213 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726882982.57474: no more pending results, returning what we have 28983 1726882982.57479: results queue empty 28983 1726882982.57480: checking for any_errors_fatal 28983 1726882982.57488: done checking for any_errors_fatal 28983 1726882982.57489: checking for max_fail_percentage 28983 1726882982.57491: done checking for max_fail_percentage 28983 1726882982.57492: checking to see if all hosts have failed and the running result is not ok 28983 1726882982.57493: done checking to see if all hosts have failed 28983 1726882982.57494: getting the remaining hosts for this loop 28983 1726882982.57496: done getting the remaining hosts for this loop 28983 1726882982.57500: getting the next task for host managed_node2 28983 1726882982.57508: done getting next task for host managed_node2 28983 1726882982.57513: ^ task is: TASK: 
fedora.linux_system_roles.network : Install packages 28983 1726882982.57519: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882982.57539: getting variables 28983 1726882982.57541: in VariableManager get_vars() 28983 1726882982.57909: Calling all_inventory to load vars for managed_node2 28983 1726882982.57913: Calling groups_inventory to load vars for managed_node2 28983 1726882982.57917: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882982.57928: Calling all_plugins_play to load vars for managed_node2 28983 1726882982.57931: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882982.57938: Calling groups_plugins_play to load vars for managed_node2 28983 1726882982.58551: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000213 28983 1726882982.58556: WORKER PROCESS EXITING 28983 1726882982.66211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882982.70132: done with get_vars() 28983 1726882982.70175: done getting variables 28983 1726882982.70245: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:43:02 -0400 (0:00:00.362) 0:00:12.700 ****** 28983 1726882982.70280: entering _queue_task() for managed_node2/package 28983 1726882982.70842: worker is 1 (out of 1 available) 28983 1726882982.70855: exiting _queue_task() for managed_node2/package 28983 1726882982.70867: done queuing things up, now waiting for results queue to drain 28983 1726882982.70869: waiting for pending results... 
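The repeated "Conditional result was False" skips in this trace follow directly from the tasks' `when:` expressions. Below is a minimal plain-Python sketch of the three conditionals seen here — it is not Ansible's actual Jinja2 evaluator, and the fact values and the `installed` package map are hypothetical stand-ins chosen to reproduce the skip outcomes logged above.

```python
# Hypothetical stand-ins for the gathered facts and role variables.
facts = {"ansible_distribution_major_version": "9"}
network_packages = ["NetworkManager"]
installed = {"NetworkManager": "1.44", "openssh-clients": "9.3"}

wireless_defined = False   # __network_wireless_connections_defined
team_defined = False       # __network_team_connections_defined

# "ansible_distribution_major_version | int < 8":
# the YUM path only applies on EL7 and older, so it is False on EL9.
use_yum = int(facts["ansible_distribution_major_version"]) < 8

# "__network_wireless_connections_defined or __network_team_connections_defined":
# consent to restart NetworkManager is only requested for wireless/team profiles.
ask_consent = wireless_defined or team_defined

# "not network_packages is subset(ansible_facts.packages.keys())":
# install only when some required package is missing from the package facts.
needs_install = not set(network_packages) <= set(installed)

print(use_yum, ask_consent, needs_install)  # prints: False False False
```

All three evaluate to False with these inputs, which matches the `skip_reason: "Conditional result was False"` results for tasks 0affe814-3a2d-b16d-c0a7-000000000212 through -000000000214. The `subset` test is what lets the role skip the package step entirely when everything it needs is already in `ansible_facts.packages`.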
28983 1726882982.70997: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726882982.71164: in run() - task 0affe814-3a2d-b16d-c0a7-000000000214 28983 1726882982.71186: variable 'ansible_search_path' from source: unknown 28983 1726882982.71190: variable 'ansible_search_path' from source: unknown 28983 1726882982.71236: calling self._execute() 28983 1726882982.71349: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882982.71358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882982.71372: variable 'omit' from source: magic vars 28983 1726882982.72242: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.72247: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882982.72580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882982.73225: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882982.73229: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882982.73279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882982.73319: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882982.73643: variable 'network_packages' from source: role '' defaults 28983 1726882982.73646: variable '__network_provider_setup' from source: role '' defaults 28983 1726882982.73649: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726882982.73699: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726882982.73708: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726882982.73782: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726882982.74092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882982.77484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882982.77575: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882982.77640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882982.77687: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882982.77745: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882982.77869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.77949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.77989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.78058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.78080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726882982.78159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.78199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.78239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.78307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.78330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.78685: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726882982.78914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.78918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.78931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.78993: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.79022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.79150: variable 'ansible_python' from source: facts 28983 1726882982.79177: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726882982.79289: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726882982.79401: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726882982.79679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.79689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.79692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.79726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.79751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.79818: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882982.79898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882982.79941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.79996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882982.80028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882982.80226: variable 'network_connections' from source: include params 28983 1726882982.80247: variable 'interface' from source: play vars 28983 1726882982.80404: variable 'interface' from source: play vars 28983 1726882982.80552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882982.80556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882982.80594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882982.80639: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882982.80710: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882982.81113: variable 'network_connections' from source: include params 28983 1726882982.81129: variable 'interface' from source: play vars 28983 1726882982.81315: variable 'interface' from source: play vars 28983 1726882982.81326: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726882982.81442: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882982.81939: variable 'network_connections' from source: include params 28983 1726882982.81967: variable 'interface' from source: play vars 28983 1726882982.82110: variable 'interface' from source: play vars 28983 1726882982.82157: variable '__network_packages_default_team' from source: role '' defaults 28983 1726882982.82275: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726882982.82744: variable 'network_connections' from source: include params 28983 1726882982.82852: variable 'interface' from source: play vars 28983 1726882982.82856: variable 'interface' from source: play vars 28983 1726882982.82939: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726882982.83028: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726882982.83046: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882982.83135: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882982.83467: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726882982.84197: variable 'network_connections' from source: include params 28983 1726882982.84210: variable 'interface' from 
source: play vars 28983 1726882982.84302: variable 'interface' from source: play vars 28983 1726882982.84320: variable 'ansible_distribution' from source: facts 28983 1726882982.84330: variable '__network_rh_distros' from source: role '' defaults 28983 1726882982.84347: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.84390: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726882982.84640: variable 'ansible_distribution' from source: facts 28983 1726882982.84651: variable '__network_rh_distros' from source: role '' defaults 28983 1726882982.84717: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.84721: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726882982.84921: variable 'ansible_distribution' from source: facts 28983 1726882982.84942: variable '__network_rh_distros' from source: role '' defaults 28983 1726882982.84954: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.85003: variable 'network_provider' from source: set_fact 28983 1726882982.85028: variable 'ansible_facts' from source: unknown 28983 1726882982.86452: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726882982.86463: when evaluation is False, skipping this task 28983 1726882982.86540: _execute() done 28983 1726882982.86544: dumping result to json 28983 1726882982.86546: done dumping result, returning 28983 1726882982.86549: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000000214] 28983 1726882982.86551: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000214 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 
1726882982.86689: no more pending results, returning what we have 28983 1726882982.86693: results queue empty 28983 1726882982.86694: checking for any_errors_fatal 28983 1726882982.86703: done checking for any_errors_fatal 28983 1726882982.86703: checking for max_fail_percentage 28983 1726882982.86706: done checking for max_fail_percentage 28983 1726882982.86707: checking to see if all hosts have failed and the running result is not ok 28983 1726882982.86707: done checking to see if all hosts have failed 28983 1726882982.86708: getting the remaining hosts for this loop 28983 1726882982.86710: done getting the remaining hosts for this loop 28983 1726882982.86715: getting the next task for host managed_node2 28983 1726882982.86723: done getting next task for host managed_node2 28983 1726882982.86728: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726882982.86735: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882982.86753: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000214 28983 1726882982.86756: WORKER PROCESS EXITING 28983 1726882982.86944: getting variables 28983 1726882982.86946: in VariableManager get_vars() 28983 1726882982.86988: Calling all_inventory to load vars for managed_node2 28983 1726882982.86992: Calling groups_inventory to load vars for managed_node2 28983 1726882982.86995: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882982.87004: Calling all_plugins_play to load vars for managed_node2 28983 1726882982.87007: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882982.87010: Calling groups_plugins_play to load vars for managed_node2 28983 1726882982.89483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882982.93295: done with get_vars() 28983 1726882982.93332: done getting variables 28983 1726882982.93466: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:43:02 -0400 (0:00:00.232) 0:00:12.933 ****** 28983 1726882982.93567: entering _queue_task() for managed_node2/package 28983 1726882982.94347: worker is 1 (out of 1 available) 28983 1726882982.94361: exiting _queue_task() for managed_node2/package 28983 1726882982.94492: done queuing things up, now waiting for results queue to drain 28983 
1726882982.94494: waiting for pending results... 28983 1726882982.94882: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726882982.95157: in run() - task 0affe814-3a2d-b16d-c0a7-000000000215 28983 1726882982.95167: variable 'ansible_search_path' from source: unknown 28983 1726882982.95170: variable 'ansible_search_path' from source: unknown 28983 1726882982.95175: calling self._execute() 28983 1726882982.95579: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882982.95582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882982.95587: variable 'omit' from source: magic vars 28983 1726882982.96392: variable 'ansible_distribution_major_version' from source: facts 28983 1726882982.96413: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882982.96604: variable 'network_state' from source: role '' defaults 28983 1726882982.96625: Evaluated conditional (network_state != {}): False 28983 1726882982.96638: when evaluation is False, skipping this task 28983 1726882982.96757: _execute() done 28983 1726882982.96762: dumping result to json 28983 1726882982.96765: done dumping result, returning 28983 1726882982.96768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000215] 28983 1726882982.96771: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000215 28983 1726882982.96851: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000215 28983 1726882982.96854: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882982.96918: no more pending results, returning what we have 28983 1726882982.96923: 
results queue empty 28983 1726882982.96924: checking for any_errors_fatal 28983 1726882982.96936: done checking for any_errors_fatal 28983 1726882982.96937: checking for max_fail_percentage 28983 1726882982.96940: done checking for max_fail_percentage 28983 1726882982.96941: checking to see if all hosts have failed and the running result is not ok 28983 1726882982.96942: done checking to see if all hosts have failed 28983 1726882982.96943: getting the remaining hosts for this loop 28983 1726882982.96945: done getting the remaining hosts for this loop 28983 1726882982.96950: getting the next task for host managed_node2 28983 1726882982.96961: done getting next task for host managed_node2 28983 1726882982.96969: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726882982.96979: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882982.96999: getting variables 28983 1726882982.97001: in VariableManager get_vars() 28983 1726882982.97158: Calling all_inventory to load vars for managed_node2 28983 1726882982.97162: Calling groups_inventory to load vars for managed_node2 28983 1726882982.97166: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882982.97182: Calling all_plugins_play to load vars for managed_node2 28983 1726882982.97187: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882982.97192: Calling groups_plugins_play to load vars for managed_node2 28983 1726882982.99867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882983.03340: done with get_vars() 28983 1726882983.03388: done getting variables 28983 1726882983.03462: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:43:03 -0400 (0:00:00.099) 0:00:13.032 ****** 28983 1726882983.03512: entering _queue_task() for managed_node2/package 28983 1726882983.03977: worker is 1 (out of 1 available) 28983 1726882983.03991: exiting _queue_task() for managed_node2/package 28983 1726882983.04003: done queuing things up, now waiting for results queue to drain 28983 1726882983.04004: waiting for pending results... 
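The two skips recorded above both hinge on the same guard: the role only installs NetworkManager/nmstate tooling when the `network_state` variable is non-empty. As a hypothetical reconstruction (the YAML below and its package list are assumptions for illustration; only the task name and the condition string `network_state != {}` are taken from the trace, and the actual source lives at the truncated `tasks/main.yml:85` path shown above), the guarded task would look roughly like:

```yaml
# Hypothetical sketch of the guarded install task; package names are
# assumptions — only the condition comes from the trace above.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

Because `network_state` comes from the role defaults as `{}` in this run, the conditional evaluates to False and the task is skipped without ever contacting the package manager, which is why the result JSON carries `"changed": false` and a `skip_reason`.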
28983 1726882983.04356: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726882983.04419: in run() - task 0affe814-3a2d-b16d-c0a7-000000000216 28983 1726882983.04451: variable 'ansible_search_path' from source: unknown 28983 1726882983.04465: variable 'ansible_search_path' from source: unknown 28983 1726882983.04515: calling self._execute() 28983 1726882983.04627: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882983.04643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882983.04670: variable 'omit' from source: magic vars 28983 1726882983.05154: variable 'ansible_distribution_major_version' from source: facts 28983 1726882983.05175: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882983.05437: variable 'network_state' from source: role '' defaults 28983 1726882983.05442: Evaluated conditional (network_state != {}): False 28983 1726882983.05445: when evaluation is False, skipping this task 28983 1726882983.05448: _execute() done 28983 1726882983.05450: dumping result to json 28983 1726882983.05453: done dumping result, returning 28983 1726882983.05455: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000216] 28983 1726882983.05458: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000216 28983 1726882983.05741: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000216 28983 1726882983.05745: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882983.05800: no more pending results, returning what we have 28983 1726882983.05804: results queue empty 28983 1726882983.05805: checking for 
any_errors_fatal 28983 1726882983.05811: done checking for any_errors_fatal 28983 1726882983.05812: checking for max_fail_percentage 28983 1726882983.05814: done checking for max_fail_percentage 28983 1726882983.05815: checking to see if all hosts have failed and the running result is not ok 28983 1726882983.05816: done checking to see if all hosts have failed 28983 1726882983.05817: getting the remaining hosts for this loop 28983 1726882983.05819: done getting the remaining hosts for this loop 28983 1726882983.05824: getting the next task for host managed_node2 28983 1726882983.05832: done getting next task for host managed_node2 28983 1726882983.05839: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726882983.05845: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882983.05862: getting variables 28983 1726882983.05863: in VariableManager get_vars() 28983 1726882983.05903: Calling all_inventory to load vars for managed_node2 28983 1726882983.05907: Calling groups_inventory to load vars for managed_node2 28983 1726882983.05909: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882983.05920: Calling all_plugins_play to load vars for managed_node2 28983 1726882983.05923: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882983.05926: Calling groups_plugins_play to load vars for managed_node2 28983 1726882983.08516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882983.11623: done with get_vars() 28983 1726882983.11661: done getting variables 28983 1726882983.11789: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:43:03 -0400 (0:00:00.083) 0:00:13.116 ****** 28983 1726882983.11833: entering _queue_task() for managed_node2/service 28983 1726882983.11836: Creating lock for service 28983 1726882983.12197: worker is 1 (out of 1 available) 28983 1726882983.12212: exiting _queue_task() for managed_node2/service 28983 1726882983.12225: done queuing things up, now waiting for results queue to drain 28983 1726882983.12227: waiting for pending results... 
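Earlier in this trace, the Install packages task was skipped on the condition `not network_packages is subset(ansible_facts.packages.keys())` — that is, it only runs when at least one requested package is missing from the gathered package facts. A minimal sketch of a task using that pattern (the YAML itself is an assumption for illustration; only the condition string appears in the log) might be:

```yaml
# Hypothetical illustration of the subset-test guard seen in this trace.
# Assumes package facts were gathered beforehand, e.g. with the
# ansible.builtin.package_facts module, so ansible_facts.packages exists.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Since every entry in `network_packages` was already present among the keys of `ansible_facts.packages`, the subset test held, its negation evaluated to False, and the install was skipped — the fast path this guard is designed to take on already-provisioned hosts.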
28983 1726882983.12537: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726882983.12767: in run() - task 0affe814-3a2d-b16d-c0a7-000000000217 28983 1726882983.12772: variable 'ansible_search_path' from source: unknown 28983 1726882983.12778: variable 'ansible_search_path' from source: unknown 28983 1726882983.12821: calling self._execute() 28983 1726882983.13001: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882983.13006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882983.13009: variable 'omit' from source: magic vars 28983 1726882983.13427: variable 'ansible_distribution_major_version' from source: facts 28983 1726882983.13452: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882983.13626: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882983.13906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882983.16701: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882983.16806: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882983.16856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882983.17006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882983.17010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882983.17055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726882983.17099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.17149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.17210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.17246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.17315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.17363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.17404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.17470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.17552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.17559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.17600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.17660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.17706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.17728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.17969: variable 'network_connections' from source: include params 28983 1726882983.18039: variable 'interface' from source: play vars 28983 1726882983.18099: variable 'interface' from source: play vars 28983 1726882983.18198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882983.18449: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882983.18505: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882983.18556: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882983.18640: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882983.18662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882983.18700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882983.18739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.18789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882983.18877: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726882983.19440: variable 'network_connections' from source: include params 28983 1726882983.19443: variable 'interface' from source: play vars 28983 1726882983.19446: variable 'interface' from source: play vars 28983 1726882983.19448: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726882983.19451: when evaluation is False, skipping this task 28983 1726882983.19454: _execute() done 28983 1726882983.19456: dumping result to json 28983 1726882983.19458: done dumping result, returning 28983 1726882983.19460: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000217] 28983 1726882983.19462: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000217 28983 1726882983.19543: done sending task result for task 
0affe814-3a2d-b16d-c0a7-000000000217 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726882983.19608: no more pending results, returning what we have 28983 1726882983.19612: results queue empty 28983 1726882983.19613: checking for any_errors_fatal 28983 1726882983.19623: done checking for any_errors_fatal 28983 1726882983.19624: checking for max_fail_percentage 28983 1726882983.19626: done checking for max_fail_percentage 28983 1726882983.19627: checking to see if all hosts have failed and the running result is not ok 28983 1726882983.19628: done checking to see if all hosts have failed 28983 1726882983.19629: getting the remaining hosts for this loop 28983 1726882983.19631: done getting the remaining hosts for this loop 28983 1726882983.19639: getting the next task for host managed_node2 28983 1726882983.19648: done getting next task for host managed_node2 28983 1726882983.19653: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726882983.19661: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882983.19683: getting variables 28983 1726882983.19685: in VariableManager get_vars() 28983 1726882983.19729: Calling all_inventory to load vars for managed_node2 28983 1726882983.19733: Calling groups_inventory to load vars for managed_node2 28983 1726882983.19743: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882983.19754: Calling all_plugins_play to load vars for managed_node2 28983 1726882983.19758: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882983.19762: Calling groups_plugins_play to load vars for managed_node2 28983 1726882983.20651: WORKER PROCESS EXITING 28983 1726882983.22456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882983.25725: done with get_vars() 28983 1726882983.25766: done getting variables 28983 1726882983.25840: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:43:03 -0400 (0:00:00.140) 0:00:13.256 ****** 28983 1726882983.25887: entering _queue_task() for managed_node2/service 28983 1726882983.26440: worker is 1 (out of 1 available) 28983 1726882983.26451: exiting _queue_task() for managed_node2/service 28983 1726882983.26462: done 
queuing things up, now waiting for results queue to drain 28983 1726882983.26464: waiting for pending results... 28983 1726882983.26602: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726882983.26775: in run() - task 0affe814-3a2d-b16d-c0a7-000000000218 28983 1726882983.26805: variable 'ansible_search_path' from source: unknown 28983 1726882983.26814: variable 'ansible_search_path' from source: unknown 28983 1726882983.26866: calling self._execute() 28983 1726882983.26980: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882983.26996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882983.27020: variable 'omit' from source: magic vars 28983 1726882983.27495: variable 'ansible_distribution_major_version' from source: facts 28983 1726882983.27515: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882983.27753: variable 'network_provider' from source: set_fact 28983 1726882983.27765: variable 'network_state' from source: role '' defaults 28983 1726882983.27793: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726882983.27804: variable 'omit' from source: magic vars 28983 1726882983.27898: variable 'omit' from source: magic vars 28983 1726882983.27940: variable 'network_service_name' from source: role '' defaults 28983 1726882983.28042: variable 'network_service_name' from source: role '' defaults 28983 1726882983.28220: variable '__network_provider_setup' from source: role '' defaults 28983 1726882983.28224: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726882983.28298: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726882983.28313: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726882983.28438: variable '__network_packages_default_nm' from source: role '' 
defaults 28983 1726882983.28731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882983.31472: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882983.31559: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882983.31616: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882983.31666: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882983.31712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882983.31819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.31864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.31913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.31976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.32001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.32079: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.32235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.32241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.32244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.32255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.32624: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726882983.32807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.32843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.32880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.32943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.32966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.33093: variable 'ansible_python' from source: facts 28983 1726882983.33127: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726882983.33323: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726882983.33346: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726882983.33508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.33544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.33588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.33641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.33741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.33744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882983.33784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882983.33817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.33869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882983.33901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882983.34080: variable 'network_connections' from source: include params 28983 1726882983.34103: variable 'interface' from source: play vars 28983 1726882983.34212: variable 'interface' from source: play vars 28983 1726882983.34359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882983.34621: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882983.34742: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882983.34753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882983.34810: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882983.34898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882983.34945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882983.35001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882983.35049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882983.35119: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882983.35524: variable 'network_connections' from source: include params 28983 1726882983.35739: variable 'interface' from source: play vars 28983 1726882983.35742: variable 'interface' from source: play vars 28983 1726882983.35744: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726882983.35799: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882983.36226: variable 'network_connections' from source: include params 28983 1726882983.36242: variable 'interface' from source: play vars 28983 1726882983.36341: variable 'interface' from source: play vars 28983 1726882983.36378: variable '__network_packages_default_team' from source: role '' defaults 28983 1726882983.36486: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726882983.36889: variable 'network_connections' from source: include params 28983 1726882983.36901: variable 'interface' from source: play vars 28983 1726882983.36996: variable 'interface' from source: play vars 28983 1726882983.37087: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726882983.37168: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726882983.37188: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882983.37265: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882983.37579: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726882983.38325: variable 'network_connections' from source: include params 28983 1726882983.38341: variable 'interface' from source: play vars 28983 1726882983.38430: variable 'interface' from source: play vars 28983 1726882983.38452: variable 'ansible_distribution' from source: facts 28983 1726882983.38463: variable '__network_rh_distros' from source: role '' defaults 28983 1726882983.38492: variable 'ansible_distribution_major_version' from source: facts 28983 1726882983.38591: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726882983.38778: variable 'ansible_distribution' from source: facts 28983 1726882983.38789: variable '__network_rh_distros' from source: role '' defaults 28983 1726882983.38800: variable 'ansible_distribution_major_version' from source: facts 28983 1726882983.38820: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726882983.39057: variable 'ansible_distribution' from source: facts 28983 1726882983.39068: variable '__network_rh_distros' from source: role '' defaults 28983 1726882983.39082: variable 'ansible_distribution_major_version' from source: facts 28983 1726882983.39126: variable 'network_provider' from source: set_fact 28983 1726882983.39169: variable 'omit' from source: magic vars 28983 1726882983.39210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882983.39352: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882983.39356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882983.39359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882983.39362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882983.39364: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882983.39376: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882983.39386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882983.39515: Set connection var ansible_connection to ssh 28983 1726882983.39533: Set connection var ansible_shell_executable to /bin/sh 28983 1726882983.39550: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882983.39568: Set connection var ansible_timeout to 10 28983 1726882983.39582: Set connection var ansible_pipelining to False 28983 1726882983.39595: Set connection var ansible_shell_type to sh 28983 1726882983.39627: variable 'ansible_shell_executable' from source: unknown 28983 1726882983.39640: variable 'ansible_connection' from source: unknown 28983 1726882983.39682: variable 'ansible_module_compression' from source: unknown 28983 1726882983.39685: variable 'ansible_shell_type' from source: unknown 28983 1726882983.39688: variable 'ansible_shell_executable' from source: unknown 28983 1726882983.39690: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882983.39692: variable 'ansible_pipelining' from source: unknown 28983 1726882983.39697: variable 'ansible_timeout' from source: unknown 28983 1726882983.39703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726882983.39845: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882983.39899: variable 'omit' from source: magic vars 28983 1726882983.39903: starting attempt loop 28983 1726882983.39905: running the handler 28983 1726882983.39994: variable 'ansible_facts' from source: unknown 28983 1726882983.41261: _low_level_execute_command(): starting 28983 1726882983.41318: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882983.42064: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882983.42084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882983.42104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882983.42240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882983.42267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726882983.42388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882983.44157: stdout chunk (state=3): >>>/root <<< 28983 1726882983.44271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882983.44320: stderr chunk (state=3): >>><<< 28983 1726882983.44324: stdout chunk (state=3): >>><<< 28983 1726882983.44344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882983.44356: _low_level_execute_command(): starting 28983 1726882983.44364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701 `" && echo ansible-tmp-1726882983.4434521-29506-157132692128701="` echo 
/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701 `" ) && sleep 0' 28983 1726882983.44778: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882983.44783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882983.44797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882983.44858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882983.44864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882983.44938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882983.46955: stdout chunk (state=3): >>>ansible-tmp-1726882983.4434521-29506-157132692128701=/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701 <<< 28983 1726882983.47071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882983.47115: stderr chunk (state=3): >>><<< 28983 1726882983.47118: stdout chunk (state=3): >>><<< 28983 1726882983.47132: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882983.4434521-29506-157132692128701=/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882983.47159: variable 'ansible_module_compression' from source: unknown 28983 1726882983.47206: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 28983 1726882983.47210: ANSIBALLZ: Acquiring lock 28983 1726882983.47215: ANSIBALLZ: Lock acquired: 140284034522080 28983 1726882983.47217: ANSIBALLZ: Creating module 28983 1726882983.83641: ANSIBALLZ: Writing module into payload 28983 1726882983.83682: ANSIBALLZ: Writing module 28983 1726882983.83722: ANSIBALLZ: Renaming module 28983 1726882983.83733: ANSIBALLZ: Done creating module 28983 1726882983.83784: variable 'ansible_facts' from source: unknown 28983 1726882983.84266: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py 28983 1726882983.84680: Sending initial data 28983 1726882983.84692: Sent initial data (156 bytes) 28983 1726882983.85344: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882983.85486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882983.85555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882983.87299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882983.87379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726882983.87464: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpdenzcagu /root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py <<< 28983 1726882983.87467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py" <<< 28983 1726882983.87546: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpdenzcagu" to remote "/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py" <<< 28983 1726882983.90115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882983.90147: stderr chunk (state=3): >>><<< 28983 1726882983.90161: stdout chunk (state=3): >>><<< 28983 1726882983.90194: done transferring module to remote 28983 1726882983.90210: _low_level_execute_command(): starting 28983 1726882983.90221: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/ /root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py && sleep 0' 28983 1726882983.90890: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882983.90906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 
1726882983.90921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882983.90940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882983.90958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882983.91000: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882983.91084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882983.91109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882983.91130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882983.91232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882983.93254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882983.93264: stdout chunk (state=3): >>><<< 28983 1726882983.93278: stderr chunk (state=3): >>><<< 28983 1726882983.93300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882983.93310: _low_level_execute_command(): starting 28983 1726882983.93320: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/AnsiballZ_systemd.py && sleep 0' 28983 1726882983.93961: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882983.93977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882983.93994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882983.94012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882983.94031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882983.94054: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882983.94154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882983.94171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882983.94188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882983.94212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882983.94331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882984.27185: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", 
"ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start<<< 28983 1726882984.27241: stdout chunk (state=3): >>>_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4440064", "MemoryAvailable": "infinity", "CPUUsageNSec": "1462464000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", 
"NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726882984.29354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882984.29477: stderr chunk (state=3): >>><<< 28983 1726882984.29481: stdout chunk (state=3): >>><<< 28983 1726882984.29502: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4440064", "MemoryAvailable": "infinity", "CPUUsageNSec": "1462464000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726882984.29939: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882984.29943: _low_level_execute_command(): starting 28983 1726882984.29946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882983.4434521-29506-157132692128701/ > /dev/null 2>&1 && sleep 0' 28983 1726882984.30545: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882984.30562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882984.30701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882984.30705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882984.30756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882984.30837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882984.32856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882984.32880: stdout chunk (state=3): >>><<< 28983 1726882984.32884: stderr chunk (state=3): >>><<< 28983 1726882984.33039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882984.33043: handler run complete 28983 
1726882984.33046: attempt loop complete, returning result 28983 1726882984.33048: _execute() done 28983 1726882984.33050: dumping result to json 28983 1726882984.33052: done dumping result, returning 28983 1726882984.33055: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000000218] 28983 1726882984.33063: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000218 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882984.34467: no more pending results, returning what we have 28983 1726882984.34471: results queue empty 28983 1726882984.34472: checking for any_errors_fatal 28983 1726882984.34483: done checking for any_errors_fatal 28983 1726882984.34483: checking for max_fail_percentage 28983 1726882984.34486: done checking for max_fail_percentage 28983 1726882984.34486: checking to see if all hosts have failed and the running result is not ok 28983 1726882984.34487: done checking to see if all hosts have failed 28983 1726882984.34488: getting the remaining hosts for this loop 28983 1726882984.34490: done getting the remaining hosts for this loop 28983 1726882984.34495: getting the next task for host managed_node2 28983 1726882984.34502: done getting next task for host managed_node2 28983 1726882984.34507: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726882984.34513: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882984.34524: getting variables 28983 1726882984.34526: in VariableManager get_vars() 28983 1726882984.34563: Calling all_inventory to load vars for managed_node2 28983 1726882984.34566: Calling groups_inventory to load vars for managed_node2 28983 1726882984.34568: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882984.34581: Calling all_plugins_play to load vars for managed_node2 28983 1726882984.34585: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882984.34589: Calling groups_plugins_play to load vars for managed_node2 28983 1726882984.35108: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000218 28983 1726882984.35111: WORKER PROCESS EXITING 28983 1726882984.36936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882984.39770: done with get_vars() 28983 1726882984.39815: done getting variables 28983 1726882984.39890: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:43:04 -0400 (0:00:01.140) 0:00:14.397 ****** 28983 1726882984.39940: entering _queue_task() for managed_node2/service 28983 1726882984.40301: worker is 1 (out of 1 available) 28983 1726882984.40314: exiting _queue_task() for managed_node2/service 28983 1726882984.40328: done queuing things up, now waiting for results queue to drain 28983 1726882984.40330: waiting for pending results... 28983 1726882984.40638: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726882984.40819: in run() - task 0affe814-3a2d-b16d-c0a7-000000000219 28983 1726882984.40844: variable 'ansible_search_path' from source: unknown 28983 1726882984.40853: variable 'ansible_search_path' from source: unknown 28983 1726882984.40903: calling self._execute() 28983 1726882984.41012: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882984.41081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882984.41085: variable 'omit' from source: magic vars 28983 1726882984.41497: variable 'ansible_distribution_major_version' from source: facts 28983 1726882984.41520: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882984.41678: variable 'network_provider' from source: set_fact 28983 1726882984.41691: Evaluated conditional (network_provider == "nm"): True 28983 1726882984.41815: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726882984.41930: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 28983 1726882984.42169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882984.45070: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882984.45132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882984.45186: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882984.45288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882984.45291: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882984.45366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882984.45412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882984.45451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882984.45512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882984.45536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882984.45601: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882984.45645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882984.45720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882984.45741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882984.45763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882984.45825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882984.45865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882984.45900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882984.45960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726882984.46052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882984.46178: variable 'network_connections' from source: include params 28983 1726882984.46198: variable 'interface' from source: play vars 28983 1726882984.46301: variable 'interface' from source: play vars 28983 1726882984.46423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882984.46648: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882984.46700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882984.46745: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882984.46786: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882984.46923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882984.46926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882984.46929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882984.46956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882984.47014: variable 
'__network_wireless_connections_defined' from source: role '' defaults 28983 1726882984.47359: variable 'network_connections' from source: include params 28983 1726882984.47376: variable 'interface' from source: play vars 28983 1726882984.47459: variable 'interface' from source: play vars 28983 1726882984.47517: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726882984.47526: when evaluation is False, skipping this task 28983 1726882984.47586: _execute() done 28983 1726882984.47589: dumping result to json 28983 1726882984.47592: done dumping result, returning 28983 1726882984.47594: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000000219] 28983 1726882984.47606: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000219 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726882984.47749: no more pending results, returning what we have 28983 1726882984.47753: results queue empty 28983 1726882984.47754: checking for any_errors_fatal 28983 1726882984.47782: done checking for any_errors_fatal 28983 1726882984.47783: checking for max_fail_percentage 28983 1726882984.47786: done checking for max_fail_percentage 28983 1726882984.47787: checking to see if all hosts have failed and the running result is not ok 28983 1726882984.47788: done checking to see if all hosts have failed 28983 1726882984.47789: getting the remaining hosts for this loop 28983 1726882984.47792: done getting the remaining hosts for this loop 28983 1726882984.47797: getting the next task for host managed_node2 28983 1726882984.47807: done getting next task for host managed_node2 28983 1726882984.47812: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726882984.47818: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882984.47838: getting variables 28983 1726882984.47840: in VariableManager get_vars() 28983 1726882984.47884: Calling all_inventory to load vars for managed_node2 28983 1726882984.47887: Calling groups_inventory to load vars for managed_node2 28983 1726882984.47890: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882984.47901: Calling all_plugins_play to load vars for managed_node2 28983 1726882984.47905: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882984.47909: Calling groups_plugins_play to load vars for managed_node2 28983 1726882984.48752: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000219 28983 1726882984.48756: WORKER PROCESS EXITING 28983 1726882984.50591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882984.53550: done with get_vars() 28983 1726882984.53590: done getting variables 28983 1726882984.53665: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:43:04 -0400 (0:00:00.137) 0:00:14.534 ****** 28983 1726882984.53706: entering _queue_task() for managed_node2/service 28983 1726882984.54065: worker is 1 (out of 1 available) 28983 1726882984.54078: exiting _queue_task() for managed_node2/service 28983 1726882984.54092: done queuing things up, now waiting for results queue to drain 28983 1726882984.54094: waiting for pending results... 
28983 1726882984.54401: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726882984.54589: in run() - task 0affe814-3a2d-b16d-c0a7-00000000021a 28983 1726882984.54614: variable 'ansible_search_path' from source: unknown 28983 1726882984.54624: variable 'ansible_search_path' from source: unknown 28983 1726882984.54675: calling self._execute() 28983 1726882984.54783: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882984.54796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882984.54813: variable 'omit' from source: magic vars 28983 1726882984.55256: variable 'ansible_distribution_major_version' from source: facts 28983 1726882984.55277: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882984.55439: variable 'network_provider' from source: set_fact 28983 1726882984.55451: Evaluated conditional (network_provider == "initscripts"): False 28983 1726882984.55460: when evaluation is False, skipping this task 28983 1726882984.55467: _execute() done 28983 1726882984.55474: dumping result to json 28983 1726882984.55483: done dumping result, returning 28983 1726882984.55495: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-00000000021a] 28983 1726882984.55506: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882984.55786: no more pending results, returning what we have 28983 1726882984.55791: results queue empty 28983 1726882984.55792: checking for any_errors_fatal 28983 1726882984.55805: done checking for any_errors_fatal 28983 1726882984.55806: checking for max_fail_percentage 28983 1726882984.55809: done checking for max_fail_percentage 28983 
1726882984.55810: checking to see if all hosts have failed and the running result is not ok 28983 1726882984.55811: done checking to see if all hosts have failed 28983 1726882984.55812: getting the remaining hosts for this loop 28983 1726882984.55814: done getting the remaining hosts for this loop 28983 1726882984.55820: getting the next task for host managed_node2 28983 1726882984.55828: done getting next task for host managed_node2 28983 1726882984.55833: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726882984.55842: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882984.55862: getting variables 28983 1726882984.55864: in VariableManager get_vars() 28983 1726882984.55905: Calling all_inventory to load vars for managed_node2 28983 1726882984.55909: Calling groups_inventory to load vars for managed_node2 28983 1726882984.55912: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882984.55925: Calling all_plugins_play to load vars for managed_node2 28983 1726882984.55929: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882984.55933: Calling groups_plugins_play to load vars for managed_node2 28983 1726882984.56139: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021a 28983 1726882984.56143: WORKER PROCESS EXITING 28983 1726882984.58374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882984.61294: done with get_vars() 28983 1726882984.61332: done getting variables 28983 1726882984.61405: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:43:04 -0400 (0:00:00.077) 0:00:14.612 ****** 28983 1726882984.61450: entering _queue_task() for managed_node2/copy 28983 1726882984.61799: worker is 1 (out of 1 available) 28983 1726882984.61812: exiting _queue_task() for managed_node2/copy 28983 1726882984.61826: done queuing things up, now waiting for results queue to drain 28983 1726882984.61827: waiting for pending results... 
28983 1726882984.62133: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726882984.62310: in run() - task 0affe814-3a2d-b16d-c0a7-00000000021b 28983 1726882984.62331: variable 'ansible_search_path' from source: unknown 28983 1726882984.62342: variable 'ansible_search_path' from source: unknown 28983 1726882984.62390: calling self._execute() 28983 1726882984.62498: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882984.62511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882984.62528: variable 'omit' from source: magic vars 28983 1726882984.62971: variable 'ansible_distribution_major_version' from source: facts 28983 1726882984.62992: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882984.63146: variable 'network_provider' from source: set_fact 28983 1726882984.63157: Evaluated conditional (network_provider == "initscripts"): False 28983 1726882984.63165: when evaluation is False, skipping this task 28983 1726882984.63173: _execute() done 28983 1726882984.63183: dumping result to json 28983 1726882984.63194: done dumping result, returning 28983 1726882984.63207: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-00000000021b] 28983 1726882984.63239: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021b 28983 1726882984.63485: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021b 28983 1726882984.63488: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726882984.63540: no more pending results, returning what we have 28983 1726882984.63545: results queue empty 28983 1726882984.63546: checking for 
any_errors_fatal 28983 1726882984.63553: done checking for any_errors_fatal 28983 1726882984.63554: checking for max_fail_percentage 28983 1726882984.63556: done checking for max_fail_percentage 28983 1726882984.63557: checking to see if all hosts have failed and the running result is not ok 28983 1726882984.63558: done checking to see if all hosts have failed 28983 1726882984.63559: getting the remaining hosts for this loop 28983 1726882984.63561: done getting the remaining hosts for this loop 28983 1726882984.63566: getting the next task for host managed_node2 28983 1726882984.63574: done getting next task for host managed_node2 28983 1726882984.63579: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726882984.63585: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882984.63602: getting variables 28983 1726882984.63604: in VariableManager get_vars() 28983 1726882984.63642: Calling all_inventory to load vars for managed_node2 28983 1726882984.63645: Calling groups_inventory to load vars for managed_node2 28983 1726882984.63648: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882984.63659: Calling all_plugins_play to load vars for managed_node2 28983 1726882984.63663: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882984.63667: Calling groups_plugins_play to load vars for managed_node2 28983 1726882984.65919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882984.68889: done with get_vars() 28983 1726882984.68924: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:43:04 -0400 (0:00:00.075) 0:00:14.687 ****** 28983 1726882984.69023: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726882984.69025: Creating lock for fedora.linux_system_roles.network_connections 28983 1726882984.69347: worker is 1 (out of 1 available) 28983 1726882984.69361: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726882984.69375: done queuing things up, now waiting for results queue to drain 28983 1726882984.69377: waiting for pending results... 
28983 1726882984.69754: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726882984.69847: in run() - task 0affe814-3a2d-b16d-c0a7-00000000021c 28983 1726882984.69871: variable 'ansible_search_path' from source: unknown 28983 1726882984.69879: variable 'ansible_search_path' from source: unknown 28983 1726882984.69924: calling self._execute() 28983 1726882984.70026: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882984.70044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882984.70061: variable 'omit' from source: magic vars 28983 1726882984.70486: variable 'ansible_distribution_major_version' from source: facts 28983 1726882984.70509: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882984.70521: variable 'omit' from source: magic vars 28983 1726882984.70601: variable 'omit' from source: magic vars 28983 1726882984.70806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882984.73676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882984.73758: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882984.73809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882984.73877: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882984.73892: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882984.73978: variable 'network_provider' from source: set_fact 28983 1726882984.74140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882984.74174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882984.74210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882984.74271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882984.74294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882984.74387: variable 'omit' from source: magic vars 28983 1726882984.74519: variable 'omit' from source: magic vars 28983 1726882984.74648: variable 'network_connections' from source: include params 28983 1726882984.74666: variable 'interface' from source: play vars 28983 1726882984.74753: variable 'interface' from source: play vars 28983 1726882984.74941: variable 'omit' from source: magic vars 28983 1726882984.74957: variable '__lsr_ansible_managed' from source: task vars 28983 1726882984.75032: variable '__lsr_ansible_managed' from source: task vars 28983 1726882984.75265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726882984.75536: Loaded config def from plugin (lookup/template) 28983 1726882984.75548: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726882984.75588: File lookup term: get_ansible_managed.j2 28983 1726882984.75598: variable 
'ansible_search_path' from source: unknown 28983 1726882984.75609: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726882984.75632: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726882984.75661: variable 'ansible_search_path' from source: unknown 28983 1726882984.91817: variable 'ansible_managed' from source: unknown 28983 1726882984.92379: variable 'omit' from source: magic vars 28983 1726882984.92414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882984.92469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882984.92582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882984.92606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726882984.92687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882984.92702: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882984.92706: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882984.92712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882984.92946: Set connection var ansible_connection to ssh 28983 1726882984.92959: Set connection var ansible_shell_executable to /bin/sh 28983 1726882984.92970: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882984.92984: Set connection var ansible_timeout to 10 28983 1726882984.92991: Set connection var ansible_pipelining to False 28983 1726882984.92994: Set connection var ansible_shell_type to sh 28983 1726882984.93159: variable 'ansible_shell_executable' from source: unknown 28983 1726882984.93163: variable 'ansible_connection' from source: unknown 28983 1726882984.93166: variable 'ansible_module_compression' from source: unknown 28983 1726882984.93170: variable 'ansible_shell_type' from source: unknown 28983 1726882984.93173: variable 'ansible_shell_executable' from source: unknown 28983 1726882984.93230: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882984.93233: variable 'ansible_pipelining' from source: unknown 28983 1726882984.93239: variable 'ansible_timeout' from source: unknown 28983 1726882984.93241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882984.93493: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882984.93503: variable 'omit' from 
source: magic vars 28983 1726882984.93512: starting attempt loop 28983 1726882984.93515: running the handler 28983 1726882984.93530: _low_level_execute_command(): starting 28983 1726882984.93878: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882984.95228: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882984.95337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882984.95360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882984.95374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882984.95397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882984.95565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882984.97360: stdout chunk (state=3): >>>/root <<< 28983 1726882984.97545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882984.97549: stdout chunk (state=3): >>><<< 28983 1726882984.97552: stderr chunk (state=3): >>><<< 28983 1726882984.97573: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882984.97590: _low_level_execute_command(): starting 28983 1726882984.97597: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061 `" && echo ansible-tmp-1726882984.9757638-29569-247990325508061="` echo /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061 `" ) && sleep 0' 28983 1726882984.98199: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882984.98240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882984.98244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882984.98246: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882984.98249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882984.98252: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882984.98311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882984.98315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726882984.98317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726882984.98320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726882984.98322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882984.98325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882984.98327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882984.98330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882984.98340: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726882984.98351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882984.98426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882984.98479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882984.98483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882984.98564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882985.00565: stdout chunk (state=3): 
>>>ansible-tmp-1726882984.9757638-29569-247990325508061=/root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061 <<< 28983 1726882985.00760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882985.00764: stdout chunk (state=3): >>><<< 28983 1726882985.00766: stderr chunk (state=3): >>><<< 28983 1726882985.00786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882984.9757638-29569-247990325508061=/root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882985.00942: variable 'ansible_module_compression' from source: unknown 28983 1726882985.00946: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 28983 1726882985.00948: ANSIBALLZ: Acquiring lock 28983 1726882985.00950: ANSIBALLZ: Lock acquired: 140284033219904 28983 
1726882985.00952: ANSIBALLZ: Creating module 28983 1726882985.38436: ANSIBALLZ: Writing module into payload 28983 1726882985.38989: ANSIBALLZ: Writing module 28983 1726882985.39040: ANSIBALLZ: Renaming module 28983 1726882985.39053: ANSIBALLZ: Done creating module 28983 1726882985.39140: variable 'ansible_facts' from source: unknown 28983 1726882985.39217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py 28983 1726882985.39437: Sending initial data 28983 1726882985.39449: Sent initial data (168 bytes) 28983 1726882985.40232: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882985.40254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882985.40271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882985.40312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882985.40411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882985.40441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882985.40460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882985.40485: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882985.40644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882985.42374: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726882985.42389: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28983 1726882985.42401: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 28983 1726882985.42410: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 28983 1726882985.42426: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882985.42525: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882985.42592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpfnebsso8 /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py <<< 28983 1726882985.42617: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py" <<< 28983 1726882985.42691: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpfnebsso8" to remote "/root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py" <<< 28983 1726882985.45017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882985.45199: stderr chunk (state=3): >>><<< 28983 1726882985.45203: stdout chunk (state=3): >>><<< 28983 1726882985.45205: done transferring module to remote 28983 1726882985.45208: _low_level_execute_command(): starting 28983 1726882985.45210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/ /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py && sleep 0' 28983 1726882985.45781: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882985.45814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882985.45817: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882985.45820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882985.45881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882985.45884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882985.45960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882985.47929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882985.47976: stderr chunk (state=3): >>><<< 28983 1726882985.47980: stdout chunk (state=3): >>><<< 28983 1726882985.48000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882985.48004: _low_level_execute_command(): starting 28983 1726882985.48032: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/AnsiballZ_network_connections.py && sleep 0' 28983 1726882985.48808: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882985.48812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882985.48815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882985.48821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882985.48824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882985.48827: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882985.48924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882985.49053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882985.49170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882985.81752: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726882985.83163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882985.83261: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882985.83286: stderr chunk (state=3): >>><<< 28983 1726882985.83297: stdout chunk (state=3): >>><<< 28983 1726882985.83323: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882985.83400: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882985.83417: _low_level_execute_command(): starting 28983 1726882985.83427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882984.9757638-29569-247990325508061/ > /dev/null 2>&1 && sleep 0' 28983 1726882985.83986: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882985.83989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882985.83992: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726882985.83994: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882985.83997: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882985.84056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882985.84059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882985.84160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882985.87003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882985.87007: stdout chunk (state=3): >>><<< 28983 1726882985.87009: stderr chunk (state=3): >>><<< 28983 1726882985.87012: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882985.87021: handler run complete 28983 1726882985.87023: attempt loop complete, returning result 28983 1726882985.87025: _execute() done 28983 1726882985.87027: dumping result to json 28983 1726882985.87029: done dumping result, returning 28983 1726882985.87032: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-00000000021c] 28983 1726882985.87035: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021c changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f 28983 1726882985.87263: no more pending results, returning what we have 28983 1726882985.87267: results queue empty 28983 1726882985.87268: checking for any_errors_fatal 28983 1726882985.87278: done checking for any_errors_fatal 28983 1726882985.87279: checking for max_fail_percentage 28983 1726882985.87282: done checking for max_fail_percentage 28983 1726882985.87283: checking to see if all hosts have failed and the running result is not ok 28983 1726882985.87284: done checking to see if all hosts have failed 
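An aside for readers of this log: the `changed: [managed_node2] => {...}` block above is simply the module's JSON stdout (captured verbatim in the stdout chunk earlier in this run) parsed and re-rendered. A minimal sketch of that decoding, with the JSON literal abridged to the fields shown in the task result (this is an illustration, not Ansible's own result-handling code):

```python
import json

# Abridged copy of the module's stdout as captured in the log above; only the
# fields rendered in the task result are kept (a sketch, not Ansible's code).
module_stdout = (
    '{"changed": true, "warnings": [], '
    '"stderr": "[002] #0, state:None persistent_state:present, '
    "'statebr': add connection statebr, "
    'd5d673f8-3c8b-4cfe-b951-473f5117625f\\n", '
    '"invocation": {"module_args": {"provider": "nm", "connections": '
    '[{"name": "statebr", "persistent_state": "present", "type": "bridge", '
    '"ip": {"dhcp4": false, "auto6": false}}]}}}'
)

result = json.loads(module_stdout)
print(result["changed"])                                # True
print(result["invocation"]["module_args"]["provider"])  # nm
# The "stderr_lines" shown by a later debug task is this stderr split on newlines:
print(result["stderr"].splitlines())
```

Note that the controller strips the SSH `debug1:`/`debug2:` noise into `stderr` separately; only the module's JSON on stdout becomes the task result.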
28983 1726882985.87285: getting the remaining hosts for this loop 28983 1726882985.87287: done getting the remaining hosts for this loop 28983 1726882985.87292: getting the next task for host managed_node2 28983 1726882985.87300: done getting next task for host managed_node2 28983 1726882985.87304: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726882985.87309: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882985.87326: getting variables 28983 1726882985.87328: in VariableManager get_vars() 28983 1726882985.87472: Calling all_inventory to load vars for managed_node2 28983 1726882985.87475: Calling groups_inventory to load vars for managed_node2 28983 1726882985.87478: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882985.87489: Calling all_plugins_play to load vars for managed_node2 28983 1726882985.87492: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882985.87496: Calling groups_plugins_play to load vars for managed_node2 28983 1726882985.87508: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021c 28983 1726882985.87511: WORKER PROCESS EXITING 28983 1726882985.90252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882985.94212: done with get_vars() 28983 1726882985.94253: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:43:05 -0400 (0:00:01.253) 0:00:15.941 ****** 28983 1726882985.94367: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726882985.94369: Creating lock for fedora.linux_system_roles.network_state 28983 1726882985.94763: worker is 1 (out of 1 available) 28983 1726882985.94777: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726882985.94790: done queuing things up, now waiting for results queue to drain 28983 1726882985.94792: waiting for pending results... 
28983 1726882985.95105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726882985.95302: in run() - task 0affe814-3a2d-b16d-c0a7-00000000021d 28983 1726882985.95328: variable 'ansible_search_path' from source: unknown 28983 1726882985.95342: variable 'ansible_search_path' from source: unknown 28983 1726882985.95399: calling self._execute() 28983 1726882985.95621: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882985.95640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882985.95663: variable 'omit' from source: magic vars 28983 1726882985.96141: variable 'ansible_distribution_major_version' from source: facts 28983 1726882985.96167: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882985.96343: variable 'network_state' from source: role '' defaults 28983 1726882985.96371: Evaluated conditional (network_state != {}): False 28983 1726882985.96375: when evaluation is False, skipping this task 28983 1726882985.96405: _execute() done 28983 1726882985.96408: dumping result to json 28983 1726882985.96411: done dumping result, returning 28983 1726882985.96414: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-00000000021d] 28983 1726882985.96423: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882985.96699: no more pending results, returning what we have 28983 1726882985.96704: results queue empty 28983 1726882985.96705: checking for any_errors_fatal 28983 1726882985.96719: done checking for any_errors_fatal 28983 1726882985.96720: checking for max_fail_percentage 28983 1726882985.96723: done checking for max_fail_percentage 28983 1726882985.96936: 
checking to see if all hosts have failed and the running result is not ok 28983 1726882985.96938: done checking to see if all hosts have failed 28983 1726882985.96939: getting the remaining hosts for this loop 28983 1726882985.96941: done getting the remaining hosts for this loop 28983 1726882985.96946: getting the next task for host managed_node2 28983 1726882985.96953: done getting next task for host managed_node2 28983 1726882985.96957: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726882985.96962: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882985.96974: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021d 28983 1726882985.96977: WORKER PROCESS EXITING 28983 1726882985.96987: getting variables 28983 1726882985.96989: in VariableManager get_vars() 28983 1726882985.97023: Calling all_inventory to load vars for managed_node2 28983 1726882985.97026: Calling groups_inventory to load vars for managed_node2 28983 1726882985.97029: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882985.97046: Calling all_plugins_play to load vars for managed_node2 28983 1726882985.97049: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882985.97054: Calling groups_plugins_play to load vars for managed_node2 28983 1726882986.00236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882986.04324: done with get_vars() 28983 1726882986.04370: done getting variables 28983 1726882986.04561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:43:06 -0400 (0:00:00.102) 0:00:16.043 ****** 28983 1726882986.04603: entering _queue_task() for managed_node2/debug 28983 1726882986.05382: worker is 1 (out of 1 available) 28983 1726882986.05396: exiting _queue_task() for managed_node2/debug 28983 1726882986.05410: done queuing things up, now waiting for results queue to drain 28983 1726882986.05412: waiting for pending results... 
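The `skipping: [managed_node2]` result above is driven by the task's `when:` condition. In plain terms (both values are taken from the log: `network_state` comes "from source: role '' defaults", and the evaluated conditional is `network_state != {}`), the decision reduces to:

```python
# Sketch of the logged conditional, not the role's actual code:
# "variable 'network_state' from source: role '' defaults" -> empty by default
network_state = {}

# "Evaluated conditional (network_state != {}): False"
should_run = network_state != {}
print(should_run)  # False -> "when evaluation is False, skipping this task"
```

Because the play configures connection profiles via `network_connections` and never sets `network_state`, the "Configure networking state" task is skipped with `false_condition: network_state != {}`, exactly as the result block records.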
28983 1726882986.05670: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726882986.05873: in run() - task 0affe814-3a2d-b16d-c0a7-00000000021e 28983 1726882986.05878: variable 'ansible_search_path' from source: unknown 28983 1726882986.05882: variable 'ansible_search_path' from source: unknown 28983 1726882986.05885: calling self._execute() 28983 1726882986.05975: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.05983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.05995: variable 'omit' from source: magic vars 28983 1726882986.06422: variable 'ansible_distribution_major_version' from source: facts 28983 1726882986.06528: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882986.06532: variable 'omit' from source: magic vars 28983 1726882986.06536: variable 'omit' from source: magic vars 28983 1726882986.06570: variable 'omit' from source: magic vars 28983 1726882986.06619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882986.06660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882986.06685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882986.06706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882986.06719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882986.06757: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882986.06761: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.06766: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726882986.06889: Set connection var ansible_connection to ssh 28983 1726882986.06903: Set connection var ansible_shell_executable to /bin/sh 28983 1726882986.06914: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882986.06938: Set connection var ansible_timeout to 10 28983 1726882986.06942: Set connection var ansible_pipelining to False 28983 1726882986.06944: Set connection var ansible_shell_type to sh 28983 1726882986.06965: variable 'ansible_shell_executable' from source: unknown 28983 1726882986.06968: variable 'ansible_connection' from source: unknown 28983 1726882986.06971: variable 'ansible_module_compression' from source: unknown 28983 1726882986.07041: variable 'ansible_shell_type' from source: unknown 28983 1726882986.07044: variable 'ansible_shell_executable' from source: unknown 28983 1726882986.07047: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.07050: variable 'ansible_pipelining' from source: unknown 28983 1726882986.07052: variable 'ansible_timeout' from source: unknown 28983 1726882986.07055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.07157: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882986.07170: variable 'omit' from source: magic vars 28983 1726882986.07182: starting attempt loop 28983 1726882986.07185: running the handler 28983 1726882986.07336: variable '__network_connections_result' from source: set_fact 28983 1726882986.07439: handler run complete 28983 1726882986.07443: attempt loop complete, returning result 28983 1726882986.07447: _execute() done 28983 1726882986.07450: dumping result to json 28983 1726882986.07452: 
done dumping result, returning 28983 1726882986.07465: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000021e] 28983 1726882986.07514: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021e 28983 1726882986.07591: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021e 28983 1726882986.07594: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f" ] } 28983 1726882986.07682: no more pending results, returning what we have 28983 1726882986.07687: results queue empty 28983 1726882986.07688: checking for any_errors_fatal 28983 1726882986.07696: done checking for any_errors_fatal 28983 1726882986.07697: checking for max_fail_percentage 28983 1726882986.07698: done checking for max_fail_percentage 28983 1726882986.07700: checking to see if all hosts have failed and the running result is not ok 28983 1726882986.07701: done checking to see if all hosts have failed 28983 1726882986.07701: getting the remaining hosts for this loop 28983 1726882986.07703: done getting the remaining hosts for this loop 28983 1726882986.07708: getting the next task for host managed_node2 28983 1726882986.07715: done getting next task for host managed_node2 28983 1726882986.07720: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726882986.07725: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882986.07746: getting variables 28983 1726882986.07749: in VariableManager get_vars() 28983 1726882986.07789: Calling all_inventory to load vars for managed_node2 28983 1726882986.07792: Calling groups_inventory to load vars for managed_node2 28983 1726882986.07795: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882986.07804: Calling all_plugins_play to load vars for managed_node2 28983 1726882986.07807: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882986.07810: Calling groups_plugins_play to load vars for managed_node2 28983 1726882986.11017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882986.19525: done with get_vars() 28983 1726882986.19876: done getting variables 28983 1726882986.20040: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:43:06 -0400 (0:00:00.154) 0:00:16.198 ****** 28983 1726882986.20094: entering _queue_task() for managed_node2/debug 28983 1726882986.20865: worker is 1 (out of 1 available) 28983 1726882986.20883: exiting _queue_task() for managed_node2/debug 28983 1726882986.20899: done queuing things up, now waiting for results queue to drain 28983 1726882986.20901: waiting for pending results... 28983 1726882986.21956: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726882986.22342: in run() - task 0affe814-3a2d-b16d-c0a7-00000000021f 28983 1726882986.22346: variable 'ansible_search_path' from source: unknown 28983 1726882986.22349: variable 'ansible_search_path' from source: unknown 28983 1726882986.22352: calling self._execute() 28983 1726882986.22709: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.22741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.22759: variable 'omit' from source: magic vars 28983 1726882986.23695: variable 'ansible_distribution_major_version' from source: facts 28983 1726882986.23720: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882986.24034: variable 'omit' from source: magic vars 28983 1726882986.24039: variable 'omit' from source: magic vars 28983 1726882986.24077: variable 'omit' from source: magic vars 28983 1726882986.24133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882986.24296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882986.24327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882986.24387: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882986.24453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882986.24500: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882986.24510: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.24519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.24655: Set connection var ansible_connection to ssh 28983 1726882986.24674: Set connection var ansible_shell_executable to /bin/sh 28983 1726882986.24697: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882986.24713: Set connection var ansible_timeout to 10 28983 1726882986.24725: Set connection var ansible_pipelining to False 28983 1726882986.24732: Set connection var ansible_shell_type to sh 28983 1726882986.24772: variable 'ansible_shell_executable' from source: unknown 28983 1726882986.24782: variable 'ansible_connection' from source: unknown 28983 1726882986.24795: variable 'ansible_module_compression' from source: unknown 28983 1726882986.24803: variable 'ansible_shell_type' from source: unknown 28983 1726882986.24811: variable 'ansible_shell_executable' from source: unknown 28983 1726882986.24818: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.24860: variable 'ansible_pipelining' from source: unknown 28983 1726882986.24864: variable 'ansible_timeout' from source: unknown 28983 1726882986.24867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.25061: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882986.25115: variable 'omit' from source: magic vars 28983 1726882986.25123: starting attempt loop 28983 1726882986.25126: running the handler 28983 1726882986.25241: variable '__network_connections_result' from source: set_fact 28983 1726882986.25308: variable '__network_connections_result' from source: set_fact 28983 1726882986.25492: handler run complete 28983 1726882986.25550: attempt loop complete, returning result 28983 1726882986.25573: _execute() done 28983 1726882986.25577: dumping result to json 28983 1726882986.25641: done dumping result, returning 28983 1726882986.25648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000021f] 28983 1726882986.25668: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f" ] } } 28983 1726882986.25965: no more pending results, returning what we have 28983 1726882986.25969: results queue empty 28983 1726882986.25970: checking for any_errors_fatal 28983 1726882986.25981: done checking for any_errors_fatal 28983 1726882986.25982: 
checking for max_fail_percentage 28983 1726882986.25984: done checking for max_fail_percentage 28983 1726882986.25985: checking to see if all hosts have failed and the running result is not ok 28983 1726882986.25986: done checking to see if all hosts have failed 28983 1726882986.25987: getting the remaining hosts for this loop 28983 1726882986.25989: done getting the remaining hosts for this loop 28983 1726882986.25994: getting the next task for host managed_node2 28983 1726882986.26003: done getting next task for host managed_node2 28983 1726882986.26008: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726882986.26013: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882986.26025: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000021f 28983 1726882986.26029: WORKER PROCESS EXITING 28983 1726882986.26181: getting variables 28983 1726882986.26183: in VariableManager get_vars() 28983 1726882986.26226: Calling all_inventory to load vars for managed_node2 28983 1726882986.26229: Calling groups_inventory to load vars for managed_node2 28983 1726882986.26232: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882986.26243: Calling all_plugins_play to load vars for managed_node2 28983 1726882986.26247: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882986.26251: Calling groups_plugins_play to load vars for managed_node2 28983 1726882986.31012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882986.37091: done with get_vars() 28983 1726882986.37128: done getting variables 28983 1726882986.37201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:43:06 -0400 (0:00:00.171) 0:00:16.370 ****** 28983 1726882986.37446: entering _queue_task() for managed_node2/debug 28983 1726882986.38219: worker is 1 (out of 1 available) 28983 1726882986.38236: exiting _queue_task() for managed_node2/debug 28983 1726882986.38251: done queuing things up, now waiting for results queue to drain 28983 1726882986.38253: waiting for pending results... 
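For context, the module_args echoed in the "Show debug messages for the network_connections" result above are consistent with a role invocation along these lines. This is a hypothetical sketch reconstructed from the logged arguments only; the actual playbook and vars file are not part of this log:

```yaml
# Hypothetical invocation implied by the module_args above; host name,
# connection profile and ip settings are copied from the logged result.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false
```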
28983 1726882986.38859: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726882986.39305: in run() - task 0affe814-3a2d-b16d-c0a7-000000000220 28983 1726882986.39309: variable 'ansible_search_path' from source: unknown 28983 1726882986.39312: variable 'ansible_search_path' from source: unknown 28983 1726882986.39314: calling self._execute() 28983 1726882986.39469: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.39532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.39553: variable 'omit' from source: magic vars 28983 1726882986.40525: variable 'ansible_distribution_major_version' from source: facts 28983 1726882986.40606: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882986.40813: variable 'network_state' from source: role '' defaults 28983 1726882986.41040: Evaluated conditional (network_state != {}): False 28983 1726882986.41045: when evaluation is False, skipping this task 28983 1726882986.41049: _execute() done 28983 1726882986.41051: dumping result to json 28983 1726882986.41054: done dumping result, returning 28983 1726882986.41057: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000000220] 28983 1726882986.41060: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000220 28983 1726882986.41135: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000220 28983 1726882986.41139: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726882986.41201: no more pending results, returning what we have 28983 1726882986.41205: results queue empty 28983 1726882986.41206: checking for any_errors_fatal 28983 1726882986.41218: done checking for any_errors_fatal 28983 1726882986.41219: checking for 
max_fail_percentage 28983 1726882986.41222: done checking for max_fail_percentage 28983 1726882986.41223: checking to see if all hosts have failed and the running result is not ok 28983 1726882986.41224: done checking to see if all hosts have failed 28983 1726882986.41225: getting the remaining hosts for this loop 28983 1726882986.41227: done getting the remaining hosts for this loop 28983 1726882986.41232: getting the next task for host managed_node2 28983 1726882986.41243: done getting next task for host managed_node2 28983 1726882986.41248: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726882986.41255: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882986.41276: getting variables 28983 1726882986.41278: in VariableManager get_vars() 28983 1726882986.41318: Calling all_inventory to load vars for managed_node2 28983 1726882986.41321: Calling groups_inventory to load vars for managed_node2 28983 1726882986.41324: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882986.41638: Calling all_plugins_play to load vars for managed_node2 28983 1726882986.41647: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882986.41652: Calling groups_plugins_play to load vars for managed_node2 28983 1726882986.46066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882986.50286: done with get_vars() 28983 1726882986.50326: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:43:06 -0400 (0:00:00.130) 0:00:16.502 ****** 28983 1726882986.50453: entering _queue_task() for managed_node2/ping 28983 1726882986.50456: Creating lock for ping 28983 1726882986.50836: worker is 1 (out of 1 available) 28983 1726882986.50852: exiting _queue_task() for managed_node2/ping 28983 1726882986.50866: done queuing things up, now waiting for results queue to drain 28983 1726882986.50868: waiting for pending results... 
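The skip logged above reports `false_condition: network_state != {}` for the task at tasks/main.yml:186. A task of roughly this shape would produce that skip; this is an assumed sketch, since the real task body is not shown in the log:

```yaml
# Hypothetical shape of the skipped task; the task name and the `when`
# expression are taken from the log, the debug body is an assumption.
- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}
```

Because the role default leaves `network_state` empty, the conditional evaluates to False and the task is skipped rather than executed.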
28983 1726882986.51406: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726882986.51760: in run() - task 0affe814-3a2d-b16d-c0a7-000000000221 28983 1726882986.51853: variable 'ansible_search_path' from source: unknown 28983 1726882986.51975: variable 'ansible_search_path' from source: unknown 28983 1726882986.51979: calling self._execute() 28983 1726882986.52196: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.52213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.52231: variable 'omit' from source: magic vars 28983 1726882986.53257: variable 'ansible_distribution_major_version' from source: facts 28983 1726882986.53405: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882986.53408: variable 'omit' from source: magic vars 28983 1726882986.53522: variable 'omit' from source: magic vars 28983 1726882986.53570: variable 'omit' from source: magic vars 28983 1726882986.53623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882986.53666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882986.53694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882986.53720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882986.53731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882986.53772: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882986.53780: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.53784: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726882986.54043: Set connection var ansible_connection to ssh 28983 1726882986.54047: Set connection var ansible_shell_executable to /bin/sh 28983 1726882986.54050: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882986.54052: Set connection var ansible_timeout to 10 28983 1726882986.54055: Set connection var ansible_pipelining to False 28983 1726882986.54057: Set connection var ansible_shell_type to sh 28983 1726882986.54191: variable 'ansible_shell_executable' from source: unknown 28983 1726882986.54197: variable 'ansible_connection' from source: unknown 28983 1726882986.54200: variable 'ansible_module_compression' from source: unknown 28983 1726882986.54205: variable 'ansible_shell_type' from source: unknown 28983 1726882986.54208: variable 'ansible_shell_executable' from source: unknown 28983 1726882986.54214: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882986.54219: variable 'ansible_pipelining' from source: unknown 28983 1726882986.54222: variable 'ansible_timeout' from source: unknown 28983 1726882986.54229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882986.54696: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882986.54703: variable 'omit' from source: magic vars 28983 1726882986.54710: starting attempt loop 28983 1726882986.54713: running the handler 28983 1726882986.54729: _low_level_execute_command(): starting 28983 1726882986.54741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882986.55866: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.55920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882986.55923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882986.55939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882986.56042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882986.57886: stdout chunk (state=3): >>>/root <<< 28983 1726882986.58073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882986.58086: stderr chunk (state=3): >>><<< 28983 1726882986.58100: stdout chunk (state=3): >>><<< 28983 1726882986.58127: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882986.58155: _low_level_execute_command(): starting 28983 1726882986.58203: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212 `" && echo ansible-tmp-1726882986.581414-29639-48417838527212="` echo /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212 `" ) && sleep 0' 28983 1726882986.58967: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882986.59040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882986.59044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882986.59046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.59048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882986.59062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.59104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882986.59111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882986.59119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882986.59303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882986.61318: stdout chunk (state=3): >>>ansible-tmp-1726882986.581414-29639-48417838527212=/root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212 <<< 28983 1726882986.61640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882986.61643: stdout chunk (state=3): >>><<< 28983 1726882986.61646: stderr chunk (state=3): >>><<< 28983 1726882986.61648: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882986.581414-29639-48417838527212=/root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882986.61651: variable 'ansible_module_compression' from source: unknown 28983 1726882986.61653: ANSIBALLZ: Using lock for ping 28983 1726882986.61655: ANSIBALLZ: Acquiring lock 28983 1726882986.61657: ANSIBALLZ: Lock acquired: 140284028755152 28983 1726882986.61659: ANSIBALLZ: Creating module 28983 1726882986.79743: ANSIBALLZ: Writing module into payload 28983 1726882986.79779: ANSIBALLZ: Writing module 28983 1726882986.79808: ANSIBALLZ: Renaming module 28983 1726882986.79851: ANSIBALLZ: Done creating module 28983 1726882986.79885: variable 'ansible_facts' from source: unknown 28983 1726882986.79976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py 28983 1726882986.80179: Sending initial data 28983 1726882986.80191: Sent initial data (151 bytes) 28983 1726882986.80624: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882986.80632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882986.80642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882986.80650: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.80675: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882986.80678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.80729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882986.80750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882986.80826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882986.82569: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882986.82865: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882986.82943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp9ls3tzv1 /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py <<< 28983 1726882986.82946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py" <<< 28983 1726882986.82997: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp9ls3tzv1" to remote "/root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py" <<< 28983 1726882986.84197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882986.84299: stderr chunk (state=3): >>><<< 28983 1726882986.84317: stdout chunk (state=3): >>><<< 28983 1726882986.84351: done transferring module to remote 28983 1726882986.84369: _low_level_execute_command(): starting 28983 1726882986.84389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/ /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py && sleep 0' 28983 1726882986.85177: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882986.85247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.85320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882986.85345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882986.85382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882986.85659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882986.87365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882986.87423: stderr chunk (state=3): >>><<< 28983 1726882986.87437: stdout chunk (state=3): >>><<< 28983 1726882986.87454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882986.87457: _low_level_execute_command(): starting 28983 1726882986.87463: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/AnsiballZ_ping.py && sleep 0' 28983 1726882986.87907: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882986.87911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882986.87914: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.87917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882986.87919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882986.88039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882986.88043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882986.88144: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.05032: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726882987.06378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882987.06442: stderr chunk (state=3): >>><<< 28983 1726882987.06446: stdout chunk (state=3): >>><<< 28983 1726882987.06462: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
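The exchange above is Ansible's standard module round trip: the AnsiballZ payload is copied to the remote temp directory over the existing SSH ControlMaster session via sftp, made executable with `chmod`, executed with the remote Python, and the module replies with a single JSON document on stdout. A minimal sketch (an assumption for illustration, not the real AnsiballZ wrapper, which ships the module inside a zip payload and validates arguments via `AnsibleModule`) of a ping-style module body producing the `{"ping": "pong", "invocation": ...}` shape seen in the stdout chunk above:

```python
import json


def run_ping(module_args: dict) -> str:
    """Build the JSON document a ping-style module writes to stdout.

    Sketch only: the real module runs inside an AnsiballZ payload on the
    managed node; here we just reproduce the result shape from the log.
    """
    data = module_args.get("data", "pong")
    result = {
        "ping": data,
        # Ansible echoes the resolved module arguments back to the controller.
        "invocation": {"module_args": {"data": data}},
    }
    return json.dumps(result)


# The controller parses this single JSON document from the command's stdout.
print(run_ping({"data": "pong"}))
```

The controller side (`_low_level_execute_command()` in the log) treats everything on stdout as the module result, which is why the SSH `debug1:`/`debug2:` noise is confined to stderr.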
28983 1726882987.06488: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882987.06498: _low_level_execute_command(): starting 28983 1726882987.06504: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882986.581414-29639-48417838527212/ > /dev/null 2>&1 && sleep 0' 28983 1726882987.07015: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.07023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.07027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.07029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.07093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882987.07101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.07185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.09093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.09139: stderr chunk (state=3): >>><<< 28983 1726882987.09142: stdout chunk (state=3): >>><<< 28983 1726882987.09159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882987.09166: handler run complete 28983 1726882987.09188: attempt loop complete, returning 
result 28983 1726882987.09191: _execute() done 28983 1726882987.09194: dumping result to json 28983 1726882987.09199: done dumping result, returning 28983 1726882987.09208: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000000221] 28983 1726882987.09213: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000221 28983 1726882987.09311: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000221 28983 1726882987.09314: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726882987.09380: no more pending results, returning what we have 28983 1726882987.09383: results queue empty 28983 1726882987.09384: checking for any_errors_fatal 28983 1726882987.09391: done checking for any_errors_fatal 28983 1726882987.09392: checking for max_fail_percentage 28983 1726882987.09394: done checking for max_fail_percentage 28983 1726882987.09395: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.09396: done checking to see if all hosts have failed 28983 1726882987.09397: getting the remaining hosts for this loop 28983 1726882987.09399: done getting the remaining hosts for this loop 28983 1726882987.09404: getting the next task for host managed_node2 28983 1726882987.09416: done getting next task for host managed_node2 28983 1726882987.09418: ^ task is: TASK: meta (role_complete) 28983 1726882987.09424: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882987.09439: getting variables 28983 1726882987.09441: in VariableManager get_vars() 28983 1726882987.09483: Calling all_inventory to load vars for managed_node2 28983 1726882987.09486: Calling groups_inventory to load vars for managed_node2 28983 1726882987.09489: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.09499: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.09502: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.09506: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.10860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.12870: done with get_vars() 28983 1726882987.12900: done getting variables 28983 1726882987.12968: done queuing things up, now waiting for results queue to drain 28983 1726882987.12970: results queue empty 28983 1726882987.12971: checking for any_errors_fatal 28983 1726882987.12975: done checking for any_errors_fatal 28983 1726882987.12975: checking for max_fail_percentage 28983 1726882987.12976: done checking for max_fail_percentage 28983 1726882987.12977: checking to see if all 
hosts have failed and the running result is not ok 28983 1726882987.12978: done checking to see if all hosts have failed 28983 1726882987.12978: getting the remaining hosts for this loop 28983 1726882987.12979: done getting the remaining hosts for this loop 28983 1726882987.12981: getting the next task for host managed_node2 28983 1726882987.12984: done getting next task for host managed_node2 28983 1726882987.12986: ^ task is: TASK: Show result 28983 1726882987.12988: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882987.12990: getting variables 28983 1726882987.12991: in VariableManager get_vars() 28983 1726882987.13000: Calling all_inventory to load vars for managed_node2 28983 1726882987.13002: Calling groups_inventory to load vars for managed_node2 28983 1726882987.13004: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.13009: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.13011: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.13013: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.14157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.15741: done with get_vars() 28983 1726882987.15760: done getting variables 28983 1726882987.15797: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Show result] *************************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14
Friday 20 September 2024 21:43:07 -0400 (0:00:00.653) 0:00:17.156 ******

28983 1726882987.15825: entering _queue_task() for managed_node2/debug 28983 1726882987.16122: worker is 1 (out of 1 available) 28983 1726882987.16139: exiting _queue_task() for managed_node2/debug 28983 1726882987.16152: done queuing things up, now waiting for results queue to drain 28983 1726882987.16154: waiting for pending results... 
28983 1726882987.16346: running TaskExecutor() for managed_node2/TASK: Show result 28983 1726882987.16425: in run() - task 0affe814-3a2d-b16d-c0a7-00000000018f 28983 1726882987.16440: variable 'ansible_search_path' from source: unknown 28983 1726882987.16443: variable 'ansible_search_path' from source: unknown 28983 1726882987.16479: calling self._execute() 28983 1726882987.16553: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.16559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.16570: variable 'omit' from source: magic vars 28983 1726882987.16892: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.16902: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.16909: variable 'omit' from source: magic vars 28983 1726882987.16954: variable 'omit' from source: magic vars 28983 1726882987.16982: variable 'omit' from source: magic vars 28983 1726882987.17018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882987.17054: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882987.17075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882987.17091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.17101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.17128: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882987.17132: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.17142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.17221: Set 
connection var ansible_connection to ssh 28983 1726882987.17231: Set connection var ansible_shell_executable to /bin/sh 28983 1726882987.17241: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882987.17252: Set connection var ansible_timeout to 10 28983 1726882987.17260: Set connection var ansible_pipelining to False 28983 1726882987.17262: Set connection var ansible_shell_type to sh 28983 1726882987.17284: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.17287: variable 'ansible_connection' from source: unknown 28983 1726882987.17290: variable 'ansible_module_compression' from source: unknown 28983 1726882987.17293: variable 'ansible_shell_type' from source: unknown 28983 1726882987.17298: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.17301: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.17306: variable 'ansible_pipelining' from source: unknown 28983 1726882987.17309: variable 'ansible_timeout' from source: unknown 28983 1726882987.17314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.17429: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882987.17441: variable 'omit' from source: magic vars 28983 1726882987.17447: starting attempt loop 28983 1726882987.17452: running the handler 28983 1726882987.17496: variable '__network_connections_result' from source: set_fact 28983 1726882987.17559: variable '__network_connections_result' from source: set_fact 28983 1726882987.17657: handler run complete 28983 1726882987.17685: attempt loop complete, returning result 28983 1726882987.17688: _execute() done 28983 1726882987.17691: dumping result to json 28983 
1726882987.17701: done dumping result, returning 28983 1726882987.17705: done running TaskExecutor() for managed_node2/TASK: Show result [0affe814-3a2d-b16d-c0a7-00000000018f] 28983 1726882987.17710: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000018f 28983 1726882987.17807: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000018f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f" ] } } 28983 1726882987.17902: no more pending results, returning what we have 28983 1726882987.17905: results queue empty 28983 1726882987.17907: checking for any_errors_fatal 28983 1726882987.17909: done checking for any_errors_fatal 28983 1726882987.17915: checking for max_fail_percentage 28983 1726882987.17917: done checking for max_fail_percentage 28983 1726882987.17918: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.17919: done checking to see if all hosts have failed 28983 1726882987.17920: getting the remaining hosts for this loop 28983 1726882987.17926: done getting the remaining hosts for this loop 28983 1726882987.17929: getting the next task for host managed_node2 28983 1726882987.17939: done getting next task for host managed_node2 28983 1726882987.17942: ^ task is: TASK: Asserts 28983 1726882987.17946: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882987.17950: getting variables 28983 1726882987.17951: in VariableManager get_vars() 28983 1726882987.17980: Calling all_inventory to load vars for managed_node2 28983 1726882987.17983: Calling groups_inventory to load vars for managed_node2 28983 1726882987.17986: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.17996: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.17999: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.18003: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.18522: WORKER PROCESS EXITING 28983 1726882987.22605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.25198: done with get_vars() 28983 1726882987.25240: done getting variables

TASK [Asserts] *****************************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36
Friday 20 September 2024 21:43:07 -0400 (0:00:00.095) 0:00:17.251 ******

28983 1726882987.25356: entering _queue_task() for managed_node2/include_tasks 28983 1726882987.25738: worker is 1 (out of 1 available) 28983 1726882987.25752: exiting _queue_task() for managed_node2/include_tasks 28983 1726882987.25765: done queuing things up, now waiting for results queue to drain 28983 1726882987.25767: waiting for pending results... 
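Stripped of the log wrapping, the `__network_connections_result` payload printed by the "Show result" task earlier is plain JSON, and the success check it supports reduces to inspecting `changed`, `failed`, and the per-connection action lines the `nm` provider emits on stderr. A sketch, assuming the payload has already been captured as text (trimmed here to the fields inspected; the UUID is the one shown in the log):

```python
import json

# Hypothetical capture of the result payload from the "Show result" task
# above, trimmed to the fields this check looks at.
raw = """
{
  "changed": true,
  "failed": false,
  "stderr_lines": [
    "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, d5d673f8-3c8b-4cfe-b951-473f5117625f"
  ]
}
"""

result = json.loads(raw)

# The run changed something, nothing failed, and the provider reported the
# 'add connection' action for the statebr profile.
profile_added = (
    result["changed"]
    and not result["failed"]
    and any("add connection statebr" in line for line in result["stderr_lines"])
)
print(profile_added)
```

This is the same information the subsequent "Asserts" include verifies on the managed node itself, via the `assert_profile_present.yml` task file loaded below.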
28983 1726882987.26158: running TaskExecutor() for managed_node2/TASK: Asserts 28983 1726882987.26230: in run() - task 0affe814-3a2d-b16d-c0a7-000000000096 28983 1726882987.26256: variable 'ansible_search_path' from source: unknown 28983 1726882987.26266: variable 'ansible_search_path' from source: unknown 28983 1726882987.26330: variable 'lsr_assert' from source: include params 28983 1726882987.26586: variable 'lsr_assert' from source: include params 28983 1726882987.26712: variable 'omit' from source: magic vars 28983 1726882987.26839: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.26862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.26883: variable 'omit' from source: magic vars 28983 1726882987.27197: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.27218: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.27253: variable 'item' from source: unknown 28983 1726882987.27324: variable 'item' from source: unknown 28983 1726882987.27441: variable 'item' from source: unknown 28983 1726882987.27461: variable 'item' from source: unknown 28983 1726882987.27847: dumping result to json 28983 1726882987.27852: done dumping result, returning 28983 1726882987.27854: done running TaskExecutor() for managed_node2/TASK: Asserts [0affe814-3a2d-b16d-c0a7-000000000096] 28983 1726882987.27856: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000096 28983 1726882987.27910: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000096 28983 1726882987.27917: WORKER PROCESS EXITING 28983 1726882987.27945: no more pending results, returning what we have 28983 1726882987.27949: in VariableManager get_vars() 28983 1726882987.27982: Calling all_inventory to load vars for managed_node2 28983 1726882987.27985: Calling groups_inventory to load vars for managed_node2 28983 1726882987.27987: Calling all_plugins_inventory 
to load vars for managed_node2 28983 1726882987.27994: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.27997: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.27999: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.29193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.30788: done with get_vars() 28983 1726882987.30808: variable 'ansible_search_path' from source: unknown 28983 1726882987.30809: variable 'ansible_search_path' from source: unknown 28983 1726882987.30843: we have included files to process 28983 1726882987.30844: generating all_blocks data 28983 1726882987.30847: done generating all_blocks data 28983 1726882987.30851: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726882987.30852: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726882987.30853: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726882987.31011: in VariableManager get_vars() 28983 1726882987.31026: done with get_vars() 28983 1726882987.31238: done processing included file 28983 1726882987.31240: iterating over new_blocks loaded from include file 28983 1726882987.31241: in VariableManager get_vars() 28983 1726882987.31252: done with get_vars() 28983 1726882987.31253: filtering new block on tags 28983 1726882987.31301: done filtering new block on tags 28983 1726882987.31303: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml) 
28983 1726882987.31307: extending task lists for all hosts with included blocks 28983 1726882987.32169: done extending task lists 28983 1726882987.32170: done processing included files 28983 1726882987.32171: results queue empty 28983 1726882987.32171: checking for any_errors_fatal 28983 1726882987.32177: done checking for any_errors_fatal 28983 1726882987.32177: checking for max_fail_percentage 28983 1726882987.32178: done checking for max_fail_percentage 28983 1726882987.32179: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.32179: done checking to see if all hosts have failed 28983 1726882987.32180: getting the remaining hosts for this loop 28983 1726882987.32181: done getting the remaining hosts for this loop 28983 1726882987.32183: getting the next task for host managed_node2 28983 1726882987.32186: done getting next task for host managed_node2 28983 1726882987.32188: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28983 1726882987.32190: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882987.32192: getting variables 28983 1726882987.32193: in VariableManager get_vars() 28983 1726882987.32199: Calling all_inventory to load vars for managed_node2 28983 1726882987.32201: Calling groups_inventory to load vars for managed_node2 28983 1726882987.32203: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.32207: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.32209: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.32211: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.33304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.34878: done with get_vars() 28983 1726882987.34900: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024 21:43:07 -0400 (0:00:00.096) 0:00:17.347 ******

28983 1726882987.34961: entering _queue_task() for managed_node2/include_tasks 28983 1726882987.35220: worker is 1 (out of 1 available) 28983 1726882987.35230: exiting _queue_task() for managed_node2/include_tasks 28983 1726882987.35244: done queuing things up, now waiting for results queue to drain 28983 1726882987.35245: waiting for pending results... 
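Before any of these tasks is queued, the executor evaluates its `when` conditional against the host's facts; that is what the repeated log line "Evaluated conditional (ansible_distribution_major_version != '6'): True" records. A stdlib-only sketch of that gate (Ansible actually templates the expression through Jinja2 against the full host-variable namespace, and the fact value `'9'` below is an assumption — the log never prints it):

```python
def evaluate_skip_el6(facts: dict) -> bool:
    """Mimic the conditional ansible_distribution_major_version != '6'.

    Sketch only: a plain dict lookup standing in for Jinja2 expression
    evaluation over host vars.
    """
    return facts.get("ansible_distribution_major_version") != "6"


# Hypothetical fact values; the managed node's real major version is not
# shown in this log excerpt.
print(evaluate_skip_el6({"ansible_distribution_major_version": "9"}))  # True
print(evaluate_skip_el6({"ansible_distribution_major_version": "6"}))  # False
```

Only after the conditional passes does the worker resolve the include target (`get_profile_stat.yml` here), load its blocks, filter them on tags, and extend the host's task list, as the lines that follow show.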
28983 1726882987.35432: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28983 1726882987.35532: in run() - task 0affe814-3a2d-b16d-c0a7-000000000383 28983 1726882987.35546: variable 'ansible_search_path' from source: unknown 28983 1726882987.35551: variable 'ansible_search_path' from source: unknown 28983 1726882987.35585: calling self._execute() 28983 1726882987.35660: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.35666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.35678: variable 'omit' from source: magic vars 28983 1726882987.35998: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.36009: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.36015: _execute() done 28983 1726882987.36019: dumping result to json 28983 1726882987.36031: done dumping result, returning 28983 1726882987.36034: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-b16d-c0a7-000000000383] 28983 1726882987.36038: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000383 28983 1726882987.36158: no more pending results, returning what we have 28983 1726882987.36164: in VariableManager get_vars() 28983 1726882987.36204: Calling all_inventory to load vars for managed_node2 28983 1726882987.36207: Calling groups_inventory to load vars for managed_node2 28983 1726882987.36211: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.36223: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.36227: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.36231: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.36250: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000383 28983 1726882987.36253: WORKER PROCESS EXITING 28983 
1726882987.37562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.39145: done with get_vars() 28983 1726882987.39164: variable 'ansible_search_path' from source: unknown 28983 1726882987.39165: variable 'ansible_search_path' from source: unknown 28983 1726882987.39172: variable 'item' from source: include params 28983 1726882987.39257: variable 'item' from source: include params 28983 1726882987.39289: we have included files to process 28983 1726882987.39291: generating all_blocks data 28983 1726882987.39292: done generating all_blocks data 28983 1726882987.39293: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726882987.39294: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726882987.39296: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726882987.40146: done processing included file 28983 1726882987.40148: iterating over new_blocks loaded from include file 28983 1726882987.40149: in VariableManager get_vars() 28983 1726882987.40163: done with get_vars() 28983 1726882987.40164: filtering new block on tags 28983 1726882987.40267: done filtering new block on tags 28983 1726882987.40270: in VariableManager get_vars() 28983 1726882987.40284: done with get_vars() 28983 1726882987.40286: filtering new block on tags 28983 1726882987.40328: done filtering new block on tags 28983 1726882987.40330: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28983 1726882987.40335: extending task lists for all hosts with included blocks 28983 1726882987.40546: done 
extending task lists 28983 1726882987.40547: done processing included files 28983 1726882987.40547: results queue empty 28983 1726882987.40548: checking for any_errors_fatal 28983 1726882987.40550: done checking for any_errors_fatal 28983 1726882987.40551: checking for max_fail_percentage 28983 1726882987.40551: done checking for max_fail_percentage 28983 1726882987.40552: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.40553: done checking to see if all hosts have failed 28983 1726882987.40553: getting the remaining hosts for this loop 28983 1726882987.40554: done getting the remaining hosts for this loop 28983 1726882987.40556: getting the next task for host managed_node2 28983 1726882987.40559: done getting next task for host managed_node2 28983 1726882987.40561: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726882987.40563: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726882987.40565: getting variables 28983 1726882987.40566: in VariableManager get_vars() 28983 1726882987.40574: Calling all_inventory to load vars for managed_node2 28983 1726882987.40576: Calling groups_inventory to load vars for managed_node2 28983 1726882987.40578: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.40582: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.40583: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.40585: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.41641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.43286: done with get_vars() 28983 1726882987.43307: done getting variables 28983 1726882987.43341: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:43:07 -0400 (0:00:00.084) 0:00:17.431 ****** 28983 1726882987.43364: entering _queue_task() for managed_node2/set_fact 28983 1726882987.43589: worker is 1 (out of 1 available) 28983 1726882987.43602: exiting _queue_task() for managed_node2/set_fact 28983 1726882987.43615: done queuing things up, now waiting for results queue to drain 28983 1726882987.43617: waiting for pending results... 
28983 1726882987.43812: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726882987.43912: in run() - task 0affe814-3a2d-b16d-c0a7-0000000003fe 28983 1726882987.43924: variable 'ansible_search_path' from source: unknown 28983 1726882987.43928: variable 'ansible_search_path' from source: unknown 28983 1726882987.43963: calling self._execute() 28983 1726882987.44048: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.44055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.44066: variable 'omit' from source: magic vars 28983 1726882987.44384: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.44396: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.44400: variable 'omit' from source: magic vars 28983 1726882987.44446: variable 'omit' from source: magic vars 28983 1726882987.44472: variable 'omit' from source: magic vars 28983 1726882987.44513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882987.44548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882987.44565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882987.44585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.44594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.44624: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882987.44628: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.44632: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726882987.44715: Set connection var ansible_connection to ssh 28983 1726882987.44724: Set connection var ansible_shell_executable to /bin/sh 28983 1726882987.44735: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882987.44745: Set connection var ansible_timeout to 10 28983 1726882987.44754: Set connection var ansible_pipelining to False 28983 1726882987.44757: Set connection var ansible_shell_type to sh 28983 1726882987.44774: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.44780: variable 'ansible_connection' from source: unknown 28983 1726882987.44783: variable 'ansible_module_compression' from source: unknown 28983 1726882987.44787: variable 'ansible_shell_type' from source: unknown 28983 1726882987.44790: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.44795: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.44800: variable 'ansible_pipelining' from source: unknown 28983 1726882987.44802: variable 'ansible_timeout' from source: unknown 28983 1726882987.44808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.44926: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882987.44938: variable 'omit' from source: magic vars 28983 1726882987.44946: starting attempt loop 28983 1726882987.44949: running the handler 28983 1726882987.44961: handler run complete 28983 1726882987.44972: attempt loop complete, returning result 28983 1726882987.44978: _execute() done 28983 1726882987.44981: dumping result to json 28983 1726882987.44986: done dumping result, returning 28983 1726882987.44993: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-b16d-c0a7-0000000003fe] 28983 1726882987.44998: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000003fe 28983 1726882987.45085: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000003fe 28983 1726882987.45088: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28983 1726882987.45149: no more pending results, returning what we have 28983 1726882987.45152: results queue empty 28983 1726882987.45153: checking for any_errors_fatal 28983 1726882987.45155: done checking for any_errors_fatal 28983 1726882987.45156: checking for max_fail_percentage 28983 1726882987.45157: done checking for max_fail_percentage 28983 1726882987.45158: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.45159: done checking to see if all hosts have failed 28983 1726882987.45160: getting the remaining hosts for this loop 28983 1726882987.45162: done getting the remaining hosts for this loop 28983 1726882987.45167: getting the next task for host managed_node2 28983 1726882987.45175: done getting next task for host managed_node2 28983 1726882987.45178: ^ task is: TASK: Stat profile file 28983 1726882987.45183: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882987.45187: getting variables 28983 1726882987.45189: in VariableManager get_vars() 28983 1726882987.45216: Calling all_inventory to load vars for managed_node2 28983 1726882987.45219: Calling groups_inventory to load vars for managed_node2 28983 1726882987.45222: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.45231: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.45242: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.45246: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.46449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.48044: done with get_vars() 28983 1726882987.48066: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:43:07 -0400 (0:00:00.047) 0:00:17.479 ****** 28983 1726882987.48137: entering _queue_task() for managed_node2/stat 28983 1726882987.48344: worker is 1 (out of 1 available) 28983 1726882987.48356: exiting _queue_task() for managed_node2/stat 28983 1726882987.48368: done queuing things up, now waiting for results queue to drain 28983 1726882987.48370: 
waiting for pending results... 28983 1726882987.48548: running TaskExecutor() for managed_node2/TASK: Stat profile file 28983 1726882987.48636: in run() - task 0affe814-3a2d-b16d-c0a7-0000000003ff 28983 1726882987.48649: variable 'ansible_search_path' from source: unknown 28983 1726882987.48652: variable 'ansible_search_path' from source: unknown 28983 1726882987.48682: calling self._execute() 28983 1726882987.48761: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.48765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.48779: variable 'omit' from source: magic vars 28983 1726882987.49086: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.49097: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.49104: variable 'omit' from source: magic vars 28983 1726882987.49156: variable 'omit' from source: magic vars 28983 1726882987.49231: variable 'profile' from source: play vars 28983 1726882987.49237: variable 'interface' from source: play vars 28983 1726882987.49301: variable 'interface' from source: play vars 28983 1726882987.49317: variable 'omit' from source: magic vars 28983 1726882987.49355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882987.49390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882987.49407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882987.49422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.49432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.49461: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726882987.49464: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.49471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.49550: Set connection var ansible_connection to ssh 28983 1726882987.49560: Set connection var ansible_shell_executable to /bin/sh 28983 1726882987.49569: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882987.49580: Set connection var ansible_timeout to 10 28983 1726882987.49587: Set connection var ansible_pipelining to False 28983 1726882987.49590: Set connection var ansible_shell_type to sh 28983 1726882987.49610: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.49613: variable 'ansible_connection' from source: unknown 28983 1726882987.49616: variable 'ansible_module_compression' from source: unknown 28983 1726882987.49618: variable 'ansible_shell_type' from source: unknown 28983 1726882987.49622: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.49626: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.49631: variable 'ansible_pipelining' from source: unknown 28983 1726882987.49635: variable 'ansible_timeout' from source: unknown 28983 1726882987.49641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.49799: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882987.49810: variable 'omit' from source: magic vars 28983 1726882987.49819: starting attempt loop 28983 1726882987.49823: running the handler 28983 1726882987.49831: _low_level_execute_command(): starting 28983 1726882987.49842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 
1726882987.50383: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.50388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.50391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.50439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882987.50444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882987.50464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.50530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.52310: stdout chunk (state=3): >>>/root <<< 28983 1726882987.52429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.52474: stderr chunk (state=3): >>><<< 28983 1726882987.52481: stdout chunk (state=3): >>><<< 28983 1726882987.52504: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882987.52516: _low_level_execute_command(): starting 28983 1726882987.52521: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491 `" && echo ansible-tmp-1726882987.525044-29685-200820402152491="` echo /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491 `" ) && sleep 0' 28983 1726882987.52985: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.52988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882987.52991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882987.53001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.53040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882987.53056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.53124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.55139: stdout chunk (state=3): >>>ansible-tmp-1726882987.525044-29685-200820402152491=/root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491 <<< 28983 1726882987.55254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.55305: stderr chunk (state=3): >>><<< 28983 1726882987.55308: stdout chunk (state=3): >>><<< 28983 1726882987.55325: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882987.525044-29685-200820402152491=/root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882987.55370: variable 'ansible_module_compression' from source: unknown 28983 1726882987.55420: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726882987.55463: variable 'ansible_facts' from source: unknown 28983 1726882987.55523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py 28983 1726882987.55640: Sending initial data 28983 1726882987.55643: Sent initial data (152 bytes) 28983 1726882987.56125: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882987.56129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882987.56131: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.56135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.56139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.56183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882987.56186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.56261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.57891: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882987.57961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882987.58027: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpvvf9pngt /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py <<< 28983 1726882987.58030: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py" <<< 28983 1726882987.58092: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpvvf9pngt" to remote "/root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py" <<< 28983 1726882987.58986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.59047: stderr chunk (state=3): >>><<< 28983 1726882987.59051: stdout chunk (state=3): >>><<< 28983 1726882987.59070: done transferring module to remote 28983 1726882987.59080: _low_level_execute_command(): starting 28983 1726882987.59085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/ /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py && sleep 0' 28983 1726882987.59510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.59546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882987.59549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882987.59552: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.59554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.59557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.59618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882987.59621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.59691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.61557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.61601: stderr chunk (state=3): >>><<< 28983 1726882987.61605: stdout chunk (state=3): >>><<< 28983 1726882987.61618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882987.61622: _low_level_execute_command(): starting 28983 1726882987.61630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/AnsiballZ_stat.py && sleep 0' 28983 1726882987.62036: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.62072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882987.62078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.62082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882987.62084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.62131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726882987.62138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.62213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.79569: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726882987.80988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882987.81052: stderr chunk (state=3): >>><<< 28983 1726882987.81057: stdout chunk (state=3): >>><<< 28983 1726882987.81075: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882987.81103: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882987.81114: _low_level_execute_command(): starting 28983 1726882987.81119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882987.525044-29685-200820402152491/ > /dev/null 2>&1 && sleep 0' 28983 1726882987.81617: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.81621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882987.81624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726882987.81626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.81628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.81669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882987.81685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.81761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.83737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.83783: stderr chunk (state=3): >>><<< 28983 1726882987.83786: stdout chunk (state=3): >>><<< 28983 1726882987.83801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882987.83806: handler run complete 28983 1726882987.83827: attempt loop complete, returning result 28983 1726882987.83830: _execute() done 28983 1726882987.83836: dumping result to json 28983 1726882987.83841: done dumping result, returning 28983 1726882987.83849: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affe814-3a2d-b16d-c0a7-0000000003ff] 28983 1726882987.83855: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000003ff ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726882987.84030: no more pending results, returning what we have 28983 1726882987.84036: results queue empty 28983 1726882987.84037: checking for any_errors_fatal 28983 1726882987.84044: done checking for any_errors_fatal 28983 1726882987.84045: checking for max_fail_percentage 28983 1726882987.84047: done checking for max_fail_percentage 28983 1726882987.84048: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.84049: done checking to see if all hosts have failed 28983 1726882987.84050: getting the remaining hosts for this loop 28983 1726882987.84053: done getting the remaining hosts for this loop 28983 1726882987.84058: getting the next task for host managed_node2 28983 1726882987.84065: done getting next task for host managed_node2 28983 1726882987.84069: ^ task is: TASK: Set NM profile exist flag based on the profile files 28983 1726882987.84076: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882987.84080: getting variables 28983 1726882987.84083: in VariableManager get_vars() 28983 1726882987.84116: Calling all_inventory to load vars for managed_node2 28983 1726882987.84119: Calling groups_inventory to load vars for managed_node2 28983 1726882987.84122: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.84133: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.84145: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.84151: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000003ff 28983 1726882987.84154: WORKER PROCESS EXITING 28983 1726882987.84159: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.85582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.87158: done with get_vars() 28983 1726882987.87181: done getting variables 28983 1726882987.87230: Loading ActionModule 
'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:43:07 -0400 (0:00:00.391) 0:00:17.870 ****** 28983 1726882987.87259: entering _queue_task() for managed_node2/set_fact 28983 1726882987.87488: worker is 1 (out of 1 available) 28983 1726882987.87501: exiting _queue_task() for managed_node2/set_fact 28983 1726882987.87514: done queuing things up, now waiting for results queue to drain 28983 1726882987.87515: waiting for pending results... 28983 1726882987.87703: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28983 1726882987.87804: in run() - task 0affe814-3a2d-b16d-c0a7-000000000400 28983 1726882987.87819: variable 'ansible_search_path' from source: unknown 28983 1726882987.87823: variable 'ansible_search_path' from source: unknown 28983 1726882987.87864: calling self._execute() 28983 1726882987.87946: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.87955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.87968: variable 'omit' from source: magic vars 28983 1726882987.88282: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.88294: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.88403: variable 'profile_stat' from source: set_fact 28983 1726882987.88409: Evaluated conditional (profile_stat.stat.exists): False 28983 1726882987.88412: when evaluation is False, skipping this task 28983 
1726882987.88415: _execute() done 28983 1726882987.88422: dumping result to json 28983 1726882987.88424: done dumping result, returning 28983 1726882987.88431: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-b16d-c0a7-000000000400] 28983 1726882987.88438: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000400 28983 1726882987.88530: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000400 28983 1726882987.88534: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726882987.88587: no more pending results, returning what we have 28983 1726882987.88591: results queue empty 28983 1726882987.88592: checking for any_errors_fatal 28983 1726882987.88599: done checking for any_errors_fatal 28983 1726882987.88600: checking for max_fail_percentage 28983 1726882987.88602: done checking for max_fail_percentage 28983 1726882987.88603: checking to see if all hosts have failed and the running result is not ok 28983 1726882987.88604: done checking to see if all hosts have failed 28983 1726882987.88605: getting the remaining hosts for this loop 28983 1726882987.88606: done getting the remaining hosts for this loop 28983 1726882987.88610: getting the next task for host managed_node2 28983 1726882987.88616: done getting next task for host managed_node2 28983 1726882987.88619: ^ task is: TASK: Get NM profile info 28983 1726882987.88624: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882987.88628: getting variables 28983 1726882987.88629: in VariableManager get_vars() 28983 1726882987.88658: Calling all_inventory to load vars for managed_node2 28983 1726882987.88661: Calling groups_inventory to load vars for managed_node2 28983 1726882987.88664: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882987.88673: Calling all_plugins_play to load vars for managed_node2 28983 1726882987.88677: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882987.88681: Calling groups_plugins_play to load vars for managed_node2 28983 1726882987.89868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882987.91553: done with get_vars() 28983 1726882987.91573: done getting variables 28983 1726882987.91648: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task 
path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:43:07 -0400 (0:00:00.044) 0:00:17.914 ****** 28983 1726882987.91672: entering _queue_task() for managed_node2/shell 28983 1726882987.91673: Creating lock for shell 28983 1726882987.91880: worker is 1 (out of 1 available) 28983 1726882987.91895: exiting _queue_task() for managed_node2/shell 28983 1726882987.91907: done queuing things up, now waiting for results queue to drain 28983 1726882987.91909: waiting for pending results... 28983 1726882987.92083: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28983 1726882987.92169: in run() - task 0affe814-3a2d-b16d-c0a7-000000000401 28983 1726882987.92185: variable 'ansible_search_path' from source: unknown 28983 1726882987.92189: variable 'ansible_search_path' from source: unknown 28983 1726882987.92217: calling self._execute() 28983 1726882987.92295: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.92301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.92312: variable 'omit' from source: magic vars 28983 1726882987.92614: variable 'ansible_distribution_major_version' from source: facts 28983 1726882987.92625: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882987.92632: variable 'omit' from source: magic vars 28983 1726882987.92685: variable 'omit' from source: magic vars 28983 1726882987.92766: variable 'profile' from source: play vars 28983 1726882987.92770: variable 'interface' from source: play vars 28983 1726882987.92835: variable 'interface' from source: play vars 28983 1726882987.92851: variable 'omit' from source: magic vars 28983 1726882987.92892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882987.92926: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882987.92945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882987.92961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.92971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882987.93003: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882987.93007: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.93010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882987.93092: Set connection var ansible_connection to ssh 28983 1726882987.93102: Set connection var ansible_shell_executable to /bin/sh 28983 1726882987.93111: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882987.93121: Set connection var ansible_timeout to 10 28983 1726882987.93130: Set connection var ansible_pipelining to False 28983 1726882987.93135: Set connection var ansible_shell_type to sh 28983 1726882987.93153: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.93157: variable 'ansible_connection' from source: unknown 28983 1726882987.93160: variable 'ansible_module_compression' from source: unknown 28983 1726882987.93162: variable 'ansible_shell_type' from source: unknown 28983 1726882987.93167: variable 'ansible_shell_executable' from source: unknown 28983 1726882987.93169: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882987.93177: variable 'ansible_pipelining' from source: unknown 28983 1726882987.93180: variable 'ansible_timeout' from source: unknown 28983 1726882987.93186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726882987.93301: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882987.93312: variable 'omit' from source: magic vars 28983 1726882987.93318: starting attempt loop 28983 1726882987.93321: running the handler 28983 1726882987.93331: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882987.93353: _low_level_execute_command(): starting 28983 1726882987.93358: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882987.93897: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.93901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.93905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.93908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found <<< 28983 1726882987.93910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.93963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882987.93971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882987.93973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.94052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882987.95820: stdout chunk (state=3): >>>/root <<< 28983 1726882987.95927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.95980: stderr chunk (state=3): >>><<< 28983 1726882987.95984: stdout chunk (state=3): >>><<< 28983 1726882987.96005: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 28983 1726882987.96016: _low_level_execute_command(): starting 28983 1726882987.96022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015 `" && echo ansible-tmp-1726882987.9600446-29694-52681230180015="` echo /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015 `" ) && sleep 0' 28983 1726882987.96491: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.96494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882987.96496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.96499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.96501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.96555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882987.96562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.96637: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 28983 1726882987.98612: stdout chunk (state=3): >>>ansible-tmp-1726882987.9600446-29694-52681230180015=/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015 <<< 28983 1726882987.98730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882987.98776: stderr chunk (state=3): >>><<< 28983 1726882987.98785: stdout chunk (state=3): >>><<< 28983 1726882987.98800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882987.9600446-29694-52681230180015=/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882987.98823: variable 'ansible_module_compression' from source: unknown 28983 1726882987.98862: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 
1726882987.98901: variable 'ansible_facts' from source: unknown 28983 1726882987.98960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py 28983 1726882987.99063: Sending initial data 28983 1726882987.99066: Sent initial data (155 bytes) 28983 1726882987.99500: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882987.99503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.99506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882987.99509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882987.99566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882987.99570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882987.99637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882988.01250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726882988.01261: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882988.01314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726882988.01389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8y57gq0e /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py <<< 28983 1726882988.01393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py" <<< 28983 1726882988.01453: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8y57gq0e" to remote "/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py" <<< 28983 1726882988.02346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882988.02403: stderr chunk (state=3): >>><<< 28983 1726882988.02407: stdout chunk (state=3): >>><<< 28983 1726882988.02425: done transferring module to remote 28983 1726882988.02435: _low_level_execute_command(): starting 28983 1726882988.02441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/ /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py && sleep 0' 28983 1726882988.02877: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882988.02880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882988.02886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882988.02889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882988.02891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882988.02940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882988.02944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882988.03020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882988.04913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882988.04959: stderr chunk (state=3): >>><<< 28983 1726882988.04962: stdout chunk (state=3): >>><<< 28983 1726882988.04978: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882988.04981: _low_level_execute_command(): starting 28983 1726882988.04984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/AnsiballZ_command.py && sleep 0' 28983 1726882988.05414: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882988.05417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882988.05421: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882988.05423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882988.05472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882988.05478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882988.05559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882988.24729: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:43:08.227018", "end": "2024-09-20 21:43:08.246077", "delta": "0:00:00.019059", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726882988.26383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882988.26442: stderr chunk (state=3): >>><<< 28983 1726882988.26445: stdout chunk (state=3): >>><<< 28983 1726882988.26464: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:43:08.227018", "end": "2024-09-20 21:43:08.246077", "delta": "0:00:00.019059", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 
closed. 28983 1726882988.26502: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882988.26510: _low_level_execute_command(): starting 28983 1726882988.26516: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882987.9600446-29694-52681230180015/ > /dev/null 2>&1 && sleep 0' 28983 1726882988.26989: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882988.26993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882988.26995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882988.26998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882988.27040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882988.27053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882988.27123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882988.29155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882988.29204: stderr chunk (state=3): >>><<< 28983 1726882988.29217: stdout chunk (state=3): >>><<< 28983 1726882988.29246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 28983 1726882988.29268: handler run complete 28983 1726882988.29340: Evaluated conditional (False): False 28983 1726882988.29344: attempt loop complete, returning result 28983 1726882988.29347: _execute() done 28983 1726882988.29352: dumping result to json 28983 1726882988.29370: done dumping result, returning 28983 1726882988.29421: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affe814-3a2d-b16d-c0a7-000000000401] 28983 1726882988.29424: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000401 ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.019059", "end": "2024-09-20 21:43:08.246077", "rc": 0, "start": "2024-09-20 21:43:08.227018" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 28983 1726882988.29841: no more pending results, returning what we have 28983 1726882988.29851: results queue empty 28983 1726882988.29852: checking for any_errors_fatal 28983 1726882988.29862: done checking for any_errors_fatal 28983 1726882988.29863: checking for max_fail_percentage 28983 1726882988.29866: done checking for max_fail_percentage 28983 1726882988.29866: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.29868: done checking to see if all hosts have failed 28983 1726882988.29868: getting the remaining hosts for this loop 28983 1726882988.29871: done getting the remaining hosts for this loop 28983 1726882988.29880: getting the next task for host managed_node2 28983 1726882988.29888: done getting next task for host managed_node2 28983 1726882988.29892: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726882988.29898: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882988.29903: getting variables 28983 1726882988.29905: in VariableManager get_vars() 28983 1726882988.29971: Calling all_inventory to load vars for managed_node2 28983 1726882988.29979: Calling groups_inventory to load vars for managed_node2 28983 1726882988.29984: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.30065: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.30069: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.30075: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.30597: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000401 28983 1726882988.30601: WORKER PROCESS EXITING 28983 1726882988.32117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.33718: done with get_vars() 28983 1726882988.33742: done getting variables 28983 1726882988.33794: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:43:08 -0400 (0:00:00.421) 0:00:18.336 ****** 28983 1726882988.33822: entering _queue_task() for managed_node2/set_fact 28983 1726882988.34058: worker is 1 (out of 1 available) 28983 1726882988.34072: exiting _queue_task() for managed_node2/set_fact 28983 1726882988.34088: done queuing things up, now waiting for results queue to drain 28983 1726882988.34090: waiting for pending results... 28983 1726882988.34268: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726882988.34378: in run() - task 0affe814-3a2d-b16d-c0a7-000000000402 28983 1726882988.34389: variable 'ansible_search_path' from source: unknown 28983 1726882988.34393: variable 'ansible_search_path' from source: unknown 28983 1726882988.34430: calling self._execute() 28983 1726882988.34509: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.34514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.34527: variable 'omit' from source: magic vars 28983 1726882988.34849: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.34874: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.34972: variable 'nm_profile_exists' from source: set_fact 28983 1726882988.34988: Evaluated conditional (nm_profile_exists.rc == 0): True 28983 1726882988.34994: variable 'omit' from source: magic 
vars 28983 1726882988.35038: variable 'omit' from source: magic vars 28983 1726882988.35065: variable 'omit' from source: magic vars 28983 1726882988.35106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882988.35138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882988.35156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882988.35172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.35185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.35215: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882988.35218: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.35223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.35310: Set connection var ansible_connection to ssh 28983 1726882988.35318: Set connection var ansible_shell_executable to /bin/sh 28983 1726882988.35327: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882988.35337: Set connection var ansible_timeout to 10 28983 1726882988.35343: Set connection var ansible_pipelining to False 28983 1726882988.35346: Set connection var ansible_shell_type to sh 28983 1726882988.35366: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.35369: variable 'ansible_connection' from source: unknown 28983 1726882988.35373: variable 'ansible_module_compression' from source: unknown 28983 1726882988.35379: variable 'ansible_shell_type' from source: unknown 28983 1726882988.35381: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.35386: variable 'ansible_host' from source: 
host vars for 'managed_node2' 28983 1726882988.35391: variable 'ansible_pipelining' from source: unknown 28983 1726882988.35397: variable 'ansible_timeout' from source: unknown 28983 1726882988.35403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.35520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882988.35530: variable 'omit' from source: magic vars 28983 1726882988.35538: starting attempt loop 28983 1726882988.35542: running the handler 28983 1726882988.35555: handler run complete 28983 1726882988.35564: attempt loop complete, returning result 28983 1726882988.35567: _execute() done 28983 1726882988.35570: dumping result to json 28983 1726882988.35578: done dumping result, returning 28983 1726882988.35585: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-b16d-c0a7-000000000402] 28983 1726882988.35591: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000402 28983 1726882988.35680: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000402 28983 1726882988.35684: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 28983 1726882988.35746: no more pending results, returning what we have 28983 1726882988.35749: results queue empty 28983 1726882988.35750: checking for any_errors_fatal 28983 1726882988.35758: done checking for any_errors_fatal 28983 1726882988.35759: checking for max_fail_percentage 28983 1726882988.35761: done checking for max_fail_percentage 28983 1726882988.35762: checking 
to see if all hosts have failed and the running result is not ok 28983 1726882988.35763: done checking to see if all hosts have failed 28983 1726882988.35764: getting the remaining hosts for this loop 28983 1726882988.35766: done getting the remaining hosts for this loop 28983 1726882988.35771: getting the next task for host managed_node2 28983 1726882988.35780: done getting next task for host managed_node2 28983 1726882988.35783: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28983 1726882988.35788: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.35791: getting variables 28983 1726882988.35793: in VariableManager get_vars() 28983 1726882988.35819: Calling all_inventory to load vars for managed_node2 28983 1726882988.35823: Calling groups_inventory to load vars for managed_node2 28983 1726882988.35826: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.35837: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.35840: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.35843: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.37177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.38744: done with get_vars() 28983 1726882988.38765: done getting variables 28983 1726882988.38812: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.38908: variable 'profile' from source: play vars 28983 1726882988.38911: variable 'interface' from source: play vars 28983 1726882988.38966: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:43:08 -0400 (0:00:00.051) 0:00:18.387 ****** 28983 1726882988.38991: entering _queue_task() for managed_node2/command 28983 1726882988.39201: worker is 1 (out of 1 available) 28983 1726882988.39213: exiting _queue_task() for managed_node2/command 28983 1726882988.39227: done queuing things up, now waiting for results queue to drain 28983 1726882988.39229: waiting for pending results... 
28983 1726882988.39409: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 28983 1726882988.39507: in run() - task 0affe814-3a2d-b16d-c0a7-000000000404 28983 1726882988.39520: variable 'ansible_search_path' from source: unknown 28983 1726882988.39523: variable 'ansible_search_path' from source: unknown 28983 1726882988.39554: calling self._execute() 28983 1726882988.39629: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.39636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.39647: variable 'omit' from source: magic vars 28983 1726882988.39944: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.39955: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.40059: variable 'profile_stat' from source: set_fact 28983 1726882988.40068: Evaluated conditional (profile_stat.stat.exists): False 28983 1726882988.40072: when evaluation is False, skipping this task 28983 1726882988.40078: _execute() done 28983 1726882988.40083: dumping result to json 28983 1726882988.40088: done dumping result, returning 28983 1726882988.40094: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000404] 28983 1726882988.40100: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000404 28983 1726882988.40194: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000404 28983 1726882988.40198: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726882988.40268: no more pending results, returning what we have 28983 1726882988.40271: results queue empty 28983 1726882988.40272: checking for any_errors_fatal 28983 1726882988.40277: done checking for any_errors_fatal 28983 1726882988.40278: 
checking for max_fail_percentage 28983 1726882988.40280: done checking for max_fail_percentage 28983 1726882988.40281: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.40282: done checking to see if all hosts have failed 28983 1726882988.40282: getting the remaining hosts for this loop 28983 1726882988.40284: done getting the remaining hosts for this loop 28983 1726882988.40288: getting the next task for host managed_node2 28983 1726882988.40294: done getting next task for host managed_node2 28983 1726882988.40297: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28983 1726882988.40301: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.40305: getting variables 28983 1726882988.40306: in VariableManager get_vars() 28983 1726882988.40332: Calling all_inventory to load vars for managed_node2 28983 1726882988.40337: Calling groups_inventory to load vars for managed_node2 28983 1726882988.40340: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.40349: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.40351: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.40354: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.41530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.43185: done with get_vars() 28983 1726882988.43206: done getting variables 28983 1726882988.43254: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.43341: variable 'profile' from source: play vars 28983 1726882988.43344: variable 'interface' from source: play vars 28983 1726882988.43391: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:43:08 -0400 (0:00:00.044) 0:00:18.431 ****** 28983 1726882988.43418: entering _queue_task() for managed_node2/set_fact 28983 1726882988.43640: worker is 1 (out of 1 available) 28983 1726882988.43653: exiting _queue_task() for managed_node2/set_fact 28983 1726882988.43667: done queuing things up, now waiting for results queue to drain 28983 1726882988.43669: waiting for pending results... 
28983 1726882988.43849: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 28983 1726882988.43942: in run() - task 0affe814-3a2d-b16d-c0a7-000000000405 28983 1726882988.43955: variable 'ansible_search_path' from source: unknown 28983 1726882988.43960: variable 'ansible_search_path' from source: unknown 28983 1726882988.43993: calling self._execute() 28983 1726882988.44071: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.44079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.44091: variable 'omit' from source: magic vars 28983 1726882988.44389: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.44399: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.44502: variable 'profile_stat' from source: set_fact 28983 1726882988.44511: Evaluated conditional (profile_stat.stat.exists): False 28983 1726882988.44515: when evaluation is False, skipping this task 28983 1726882988.44518: _execute() done 28983 1726882988.44523: dumping result to json 28983 1726882988.44528: done dumping result, returning 28983 1726882988.44536: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000405] 28983 1726882988.44543: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000405 28983 1726882988.44633: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000405 28983 1726882988.44639: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726882988.44703: no more pending results, returning what we have 28983 1726882988.44707: results queue empty 28983 1726882988.44708: checking for any_errors_fatal 28983 1726882988.44713: done checking for any_errors_fatal 28983 1726882988.44714: 
checking for max_fail_percentage 28983 1726882988.44716: done checking for max_fail_percentage 28983 1726882988.44717: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.44718: done checking to see if all hosts have failed 28983 1726882988.44719: getting the remaining hosts for this loop 28983 1726882988.44721: done getting the remaining hosts for this loop 28983 1726882988.44725: getting the next task for host managed_node2 28983 1726882988.44731: done getting next task for host managed_node2 28983 1726882988.44736: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28983 1726882988.44741: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.44745: getting variables 28983 1726882988.44746: in VariableManager get_vars() 28983 1726882988.44773: Calling all_inventory to load vars for managed_node2 28983 1726882988.44776: Calling groups_inventory to load vars for managed_node2 28983 1726882988.44779: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.44788: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.44791: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.44794: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.45976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.47559: done with get_vars() 28983 1726882988.47580: done getting variables 28983 1726882988.47630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.47712: variable 'profile' from source: play vars 28983 1726882988.47715: variable 'interface' from source: play vars 28983 1726882988.47764: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:43:08 -0400 (0:00:00.043) 0:00:18.475 ****** 28983 1726882988.47789: entering _queue_task() for managed_node2/command 28983 1726882988.47994: worker is 1 (out of 1 available) 28983 1726882988.48007: exiting _queue_task() for managed_node2/command 28983 1726882988.48020: done queuing things up, now waiting for results queue to drain 28983 1726882988.48022: waiting for pending results... 
28983 1726882988.48207: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 28983 1726882988.48298: in run() - task 0affe814-3a2d-b16d-c0a7-000000000406 28983 1726882988.48310: variable 'ansible_search_path' from source: unknown 28983 1726882988.48313: variable 'ansible_search_path' from source: unknown 28983 1726882988.48344: calling self._execute() 28983 1726882988.48423: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.48429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.48443: variable 'omit' from source: magic vars 28983 1726882988.48741: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.48751: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.48857: variable 'profile_stat' from source: set_fact 28983 1726882988.48868: Evaluated conditional (profile_stat.stat.exists): False 28983 1726882988.48871: when evaluation is False, skipping this task 28983 1726882988.48874: _execute() done 28983 1726882988.48882: dumping result to json 28983 1726882988.48885: done dumping result, returning 28983 1726882988.48891: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000406] 28983 1726882988.48899: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000406 28983 1726882988.48990: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000406 28983 1726882988.48993: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726882988.49062: no more pending results, returning what we have 28983 1726882988.49066: results queue empty 28983 1726882988.49067: checking for any_errors_fatal 28983 1726882988.49072: done checking for any_errors_fatal 28983 1726882988.49073: checking for 
max_fail_percentage 28983 1726882988.49075: done checking for max_fail_percentage 28983 1726882988.49076: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.49077: done checking to see if all hosts have failed 28983 1726882988.49078: getting the remaining hosts for this loop 28983 1726882988.49080: done getting the remaining hosts for this loop 28983 1726882988.49083: getting the next task for host managed_node2 28983 1726882988.49090: done getting next task for host managed_node2 28983 1726882988.49092: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28983 1726882988.49097: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.49101: getting variables 28983 1726882988.49102: in VariableManager get_vars() 28983 1726882988.49129: Calling all_inventory to load vars for managed_node2 28983 1726882988.49132: Calling groups_inventory to load vars for managed_node2 28983 1726882988.49137: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.49146: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.49148: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.49151: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.50472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.52032: done with get_vars() 28983 1726882988.52055: done getting variables 28983 1726882988.52103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.52183: variable 'profile' from source: play vars 28983 1726882988.52187: variable 'interface' from source: play vars 28983 1726882988.52231: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:43:08 -0400 (0:00:00.044) 0:00:18.520 ****** 28983 1726882988.52257: entering _queue_task() for managed_node2/set_fact 28983 1726882988.52459: worker is 1 (out of 1 available) 28983 1726882988.52474: exiting _queue_task() for managed_node2/set_fact 28983 1726882988.52489: done queuing things up, now waiting for results queue to drain 28983 1726882988.52491: waiting for pending results... 
28983 1726882988.52669: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 28983 1726882988.52766: in run() - task 0affe814-3a2d-b16d-c0a7-000000000407 28983 1726882988.52782: variable 'ansible_search_path' from source: unknown 28983 1726882988.52785: variable 'ansible_search_path' from source: unknown 28983 1726882988.52814: calling self._execute() 28983 1726882988.52893: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.52899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.52909: variable 'omit' from source: magic vars 28983 1726882988.53208: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.53218: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.53322: variable 'profile_stat' from source: set_fact 28983 1726882988.53332: Evaluated conditional (profile_stat.stat.exists): False 28983 1726882988.53337: when evaluation is False, skipping this task 28983 1726882988.53340: _execute() done 28983 1726882988.53343: dumping result to json 28983 1726882988.53349: done dumping result, returning 28983 1726882988.53355: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000407] 28983 1726882988.53362: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000407 28983 1726882988.53457: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000407 28983 1726882988.53460: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726882988.53520: no more pending results, returning what we have 28983 1726882988.53524: results queue empty 28983 1726882988.53525: checking for any_errors_fatal 28983 1726882988.53530: done checking for any_errors_fatal 28983 1726882988.53531: checking 
for max_fail_percentage 28983 1726882988.53535: done checking for max_fail_percentage 28983 1726882988.53537: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.53537: done checking to see if all hosts have failed 28983 1726882988.53538: getting the remaining hosts for this loop 28983 1726882988.53540: done getting the remaining hosts for this loop 28983 1726882988.53544: getting the next task for host managed_node2 28983 1726882988.53552: done getting next task for host managed_node2 28983 1726882988.53555: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 28983 1726882988.53559: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.53563: getting variables 28983 1726882988.53564: in VariableManager get_vars() 28983 1726882988.53593: Calling all_inventory to load vars for managed_node2 28983 1726882988.53596: Calling groups_inventory to load vars for managed_node2 28983 1726882988.53599: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.53608: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.53611: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.53613: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.54789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.56446: done with get_vars() 28983 1726882988.56466: done getting variables 28983 1726882988.56512: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.56598: variable 'profile' from source: play vars 28983 1726882988.56601: variable 'interface' from source: play vars 28983 1726882988.56648: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:43:08 -0400 (0:00:00.044) 0:00:18.564 ****** 28983 1726882988.56672: entering _queue_task() for managed_node2/assert 28983 1726882988.56868: worker is 1 (out of 1 available) 28983 1726882988.56883: exiting _queue_task() for managed_node2/assert 28983 1726882988.56899: done queuing things up, now waiting for results queue to drain 28983 1726882988.56901: waiting for pending results... 
28983 1726882988.57070: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 28983 1726882988.57151: in run() - task 0affe814-3a2d-b16d-c0a7-000000000384 28983 1726882988.57162: variable 'ansible_search_path' from source: unknown 28983 1726882988.57165: variable 'ansible_search_path' from source: unknown 28983 1726882988.57198: calling self._execute() 28983 1726882988.57275: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.57284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.57293: variable 'omit' from source: magic vars 28983 1726882988.57583: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.57593: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.57600: variable 'omit' from source: magic vars 28983 1726882988.57641: variable 'omit' from source: magic vars 28983 1726882988.57724: variable 'profile' from source: play vars 28983 1726882988.57728: variable 'interface' from source: play vars 28983 1726882988.57785: variable 'interface' from source: play vars 28983 1726882988.57804: variable 'omit' from source: magic vars 28983 1726882988.57839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882988.57868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882988.57890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882988.57908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.57919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.57947: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726882988.57950: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.57956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.58039: Set connection var ansible_connection to ssh 28983 1726882988.58049: Set connection var ansible_shell_executable to /bin/sh 28983 1726882988.58058: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882988.58066: Set connection var ansible_timeout to 10 28983 1726882988.58072: Set connection var ansible_pipelining to False 28983 1726882988.58079: Set connection var ansible_shell_type to sh 28983 1726882988.58099: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.58103: variable 'ansible_connection' from source: unknown 28983 1726882988.58105: variable 'ansible_module_compression' from source: unknown 28983 1726882988.58108: variable 'ansible_shell_type' from source: unknown 28983 1726882988.58110: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.58122: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.58125: variable 'ansible_pipelining' from source: unknown 28983 1726882988.58128: variable 'ansible_timeout' from source: unknown 28983 1726882988.58130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.58245: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882988.58255: variable 'omit' from source: magic vars 28983 1726882988.58261: starting attempt loop 28983 1726882988.58264: running the handler 28983 1726882988.58358: variable 'lsr_net_profile_exists' from source: set_fact 28983 1726882988.58364: Evaluated conditional 
(lsr_net_profile_exists): True 28983 1726882988.58371: handler run complete 28983 1726882988.58386: attempt loop complete, returning result 28983 1726882988.58389: _execute() done 28983 1726882988.58392: dumping result to json 28983 1726882988.58397: done dumping result, returning 28983 1726882988.58404: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [0affe814-3a2d-b16d-c0a7-000000000384] 28983 1726882988.58409: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000384 28983 1726882988.58495: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000384 28983 1726882988.58498: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726882988.58554: no more pending results, returning what we have 28983 1726882988.58557: results queue empty 28983 1726882988.58558: checking for any_errors_fatal 28983 1726882988.58564: done checking for any_errors_fatal 28983 1726882988.58565: checking for max_fail_percentage 28983 1726882988.58567: done checking for max_fail_percentage 28983 1726882988.58568: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.58569: done checking to see if all hosts have failed 28983 1726882988.58570: getting the remaining hosts for this loop 28983 1726882988.58571: done getting the remaining hosts for this loop 28983 1726882988.58575: getting the next task for host managed_node2 28983 1726882988.58581: done getting next task for host managed_node2 28983 1726882988.58584: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 28983 1726882988.58588: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882988.58592: getting variables 28983 1726882988.58593: in VariableManager get_vars() 28983 1726882988.58627: Calling all_inventory to load vars for managed_node2 28983 1726882988.58631: Calling groups_inventory to load vars for managed_node2 28983 1726882988.58636: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.58645: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.58648: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.58652: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.59832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.61406: done with get_vars() 28983 1726882988.61428: done getting variables 28983 1726882988.61473: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.61556: variable 'profile' from source: play vars 28983 1726882988.61559: variable 'interface' from source: play vars 28983 1726882988.61604: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:43:08 -0400 (0:00:00.049) 0:00:18.614 ****** 28983 1726882988.61631: entering _queue_task() for managed_node2/assert 28983 1726882988.61832: worker is 1 (out of 1 available) 28983 1726882988.61847: exiting _queue_task() for managed_node2/assert 28983 1726882988.61861: done queuing things up, now waiting for results queue to drain 28983 1726882988.61863: waiting for pending results... 28983 1726882988.62032: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 28983 1726882988.62117: in run() - task 0affe814-3a2d-b16d-c0a7-000000000385 28983 1726882988.62130: variable 'ansible_search_path' from source: unknown 28983 1726882988.62135: variable 'ansible_search_path' from source: unknown 28983 1726882988.62164: calling self._execute() 28983 1726882988.62239: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.62246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.62256: variable 'omit' from source: magic vars 28983 1726882988.62546: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.62558: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.62564: variable 'omit' from source: magic vars 28983 1726882988.62602: variable 'omit' from source: magic vars 28983 1726882988.62687: variable 'profile' from source: play vars 28983 1726882988.62691: variable 'interface' from source: play vars 28983 1726882988.62747: variable 'interface' from source: play vars 28983 1726882988.62764: variable 'omit' from source: magic vars 28983 1726882988.62801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
28983 1726882988.62830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882988.62851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882988.62869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.62882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.62908: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882988.62912: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.62916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.63000: Set connection var ansible_connection to ssh 28983 1726882988.63010: Set connection var ansible_shell_executable to /bin/sh 28983 1726882988.63018: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882988.63027: Set connection var ansible_timeout to 10 28983 1726882988.63033: Set connection var ansible_pipelining to False 28983 1726882988.63038: Set connection var ansible_shell_type to sh 28983 1726882988.63056: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.63059: variable 'ansible_connection' from source: unknown 28983 1726882988.63063: variable 'ansible_module_compression' from source: unknown 28983 1726882988.63065: variable 'ansible_shell_type' from source: unknown 28983 1726882988.63070: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.63073: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.63083: variable 'ansible_pipelining' from source: unknown 28983 1726882988.63086: variable 'ansible_timeout' from source: unknown 28983 1726882988.63091: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.63205: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882988.63216: variable 'omit' from source: magic vars 28983 1726882988.63222: starting attempt loop 28983 1726882988.63225: running the handler 28983 1726882988.63318: variable 'lsr_net_profile_ansible_managed' from source: set_fact 28983 1726882988.63323: Evaluated conditional (lsr_net_profile_ansible_managed): True 28983 1726882988.63330: handler run complete 28983 1726882988.63344: attempt loop complete, returning result 28983 1726882988.63347: _execute() done 28983 1726882988.63350: dumping result to json 28983 1726882988.63356: done dumping result, returning 28983 1726882988.63363: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0affe814-3a2d-b16d-c0a7-000000000385] 28983 1726882988.63368: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000385 28983 1726882988.63461: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000385 28983 1726882988.63464: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726882988.63550: no more pending results, returning what we have 28983 1726882988.63553: results queue empty 28983 1726882988.63554: checking for any_errors_fatal 28983 1726882988.63559: done checking for any_errors_fatal 28983 1726882988.63560: checking for max_fail_percentage 28983 1726882988.63562: done checking for max_fail_percentage 28983 1726882988.63563: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.63564: done checking to see if all hosts have failed 28983 1726882988.63565: 
getting the remaining hosts for this loop 28983 1726882988.63566: done getting the remaining hosts for this loop 28983 1726882988.63570: getting the next task for host managed_node2 28983 1726882988.63578: done getting next task for host managed_node2 28983 1726882988.63580: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 28983 1726882988.63585: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.63589: getting variables 28983 1726882988.63590: in VariableManager get_vars() 28983 1726882988.63616: Calling all_inventory to load vars for managed_node2 28983 1726882988.63618: Calling groups_inventory to load vars for managed_node2 28983 1726882988.63621: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.63628: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.63630: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.63632: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.64947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.66507: done with get_vars() 28983 1726882988.66529: done getting variables 28983 1726882988.66580: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882988.66667: variable 'profile' from source: play vars 28983 1726882988.66670: variable 'interface' from source: play vars 28983 1726882988.66720: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:43:08 -0400 (0:00:00.051) 0:00:18.665 ****** 28983 1726882988.66748: entering _queue_task() for managed_node2/assert 28983 1726882988.66984: worker is 1 (out of 1 available) 28983 1726882988.67000: exiting _queue_task() for managed_node2/assert 28983 1726882988.67014: done queuing things up, now waiting for results queue to drain 28983 1726882988.67016: waiting for pending results... 
28983 1726882988.67209: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr 28983 1726882988.67302: in run() - task 0affe814-3a2d-b16d-c0a7-000000000386 28983 1726882988.67314: variable 'ansible_search_path' from source: unknown 28983 1726882988.67318: variable 'ansible_search_path' from source: unknown 28983 1726882988.67351: calling self._execute() 28983 1726882988.67430: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.67437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.67447: variable 'omit' from source: magic vars 28983 1726882988.67752: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.67762: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.67768: variable 'omit' from source: magic vars 28983 1726882988.67814: variable 'omit' from source: magic vars 28983 1726882988.67895: variable 'profile' from source: play vars 28983 1726882988.67902: variable 'interface' from source: play vars 28983 1726882988.67956: variable 'interface' from source: play vars 28983 1726882988.67971: variable 'omit' from source: magic vars 28983 1726882988.68013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882988.68046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882988.68064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882988.68085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.68095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882988.68124: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 28983 1726882988.68128: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.68130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.68213: Set connection var ansible_connection to ssh 28983 1726882988.68224: Set connection var ansible_shell_executable to /bin/sh 28983 1726882988.68233: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882988.68243: Set connection var ansible_timeout to 10 28983 1726882988.68251: Set connection var ansible_pipelining to False 28983 1726882988.68254: Set connection var ansible_shell_type to sh 28983 1726882988.68276: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.68281: variable 'ansible_connection' from source: unknown 28983 1726882988.68284: variable 'ansible_module_compression' from source: unknown 28983 1726882988.68288: variable 'ansible_shell_type' from source: unknown 28983 1726882988.68291: variable 'ansible_shell_executable' from source: unknown 28983 1726882988.68295: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.68300: variable 'ansible_pipelining' from source: unknown 28983 1726882988.68303: variable 'ansible_timeout' from source: unknown 28983 1726882988.68309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.68425: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882988.68438: variable 'omit' from source: magic vars 28983 1726882988.68445: starting attempt loop 28983 1726882988.68448: running the handler 28983 1726882988.68543: variable 'lsr_net_profile_fingerprint' from source: set_fact 28983 1726882988.68548: Evaluated 
conditional (lsr_net_profile_fingerprint): True 28983 1726882988.68555: handler run complete 28983 1726882988.68569: attempt loop complete, returning result 28983 1726882988.68572: _execute() done 28983 1726882988.68578: dumping result to json 28983 1726882988.68590: done dumping result, returning 28983 1726882988.68593: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [0affe814-3a2d-b16d-c0a7-000000000386] 28983 1726882988.68596: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000386 28983 1726882988.68685: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000386 28983 1726882988.68688: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726882988.68755: no more pending results, returning what we have 28983 1726882988.68758: results queue empty 28983 1726882988.68759: checking for any_errors_fatal 28983 1726882988.68765: done checking for any_errors_fatal 28983 1726882988.68766: checking for max_fail_percentage 28983 1726882988.68768: done checking for max_fail_percentage 28983 1726882988.68770: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.68770: done checking to see if all hosts have failed 28983 1726882988.68772: getting the remaining hosts for this loop 28983 1726882988.68773: done getting the remaining hosts for this loop 28983 1726882988.68778: getting the next task for host managed_node2 28983 1726882988.68787: done getting next task for host managed_node2 28983 1726882988.68790: ^ task is: TASK: Conditional asserts 28983 1726882988.68793: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882988.68797: getting variables 28983 1726882988.68799: in VariableManager get_vars() 28983 1726882988.68827: Calling all_inventory to load vars for managed_node2 28983 1726882988.68829: Calling groups_inventory to load vars for managed_node2 28983 1726882988.68833: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.68844: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.68847: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.68851: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.70082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.76110: done with get_vars() 28983 1726882988.76148: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:43:08 -0400 (0:00:00.094) 0:00:18.760 ****** 28983 1726882988.76242: entering _queue_task() for managed_node2/include_tasks 28983 1726882988.76584: worker is 1 (out of 1 available) 28983 1726882988.76599: exiting _queue_task() for managed_node2/include_tasks 28983 1726882988.76613: done queuing things up, now waiting for results queue to drain 28983 1726882988.76615: waiting for pending results... 
28983 1726882988.76958: running TaskExecutor() for managed_node2/TASK: Conditional asserts 28983 1726882988.77079: in run() - task 0affe814-3a2d-b16d-c0a7-000000000097 28983 1726882988.77141: variable 'ansible_search_path' from source: unknown 28983 1726882988.77145: variable 'ansible_search_path' from source: unknown 28983 1726882988.77478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882988.79277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882988.79332: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882988.79375: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882988.79408: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882988.79432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882988.79502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882988.79530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882988.79555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882988.79603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 28983 1726882988.79619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882988.79877: variable 'lsr_assert_when' from source: include params 28983 1726882988.79880: variable 'network_provider' from source: set_fact 28983 1726882988.80140: variable 'omit' from source: magic vars 28983 1726882988.80144: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.80147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.80149: variable 'omit' from source: magic vars 28983 1726882988.80365: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.80381: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.80520: variable 'item' from source: unknown 28983 1726882988.80531: Evaluated conditional (item['condition']): True 28983 1726882988.80632: variable 'item' from source: unknown 28983 1726882988.80674: variable 'item' from source: unknown 28983 1726882988.80743: variable 'item' from source: unknown 28983 1726882988.80908: dumping result to json 28983 1726882988.80911: done dumping result, returning 28983 1726882988.80924: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0affe814-3a2d-b16d-c0a7-000000000097] 28983 1726882988.80927: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000097 28983 1726882988.80985: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000097 28983 1726882988.80988: WORKER PROCESS EXITING 28983 1726882988.81055: no more pending results, returning what we have 28983 1726882988.81060: in VariableManager get_vars() 28983 1726882988.81092: Calling all_inventory to load vars for managed_node2 28983 1726882988.81095: Calling groups_inventory to load vars for managed_node2 28983 1726882988.81098: 
Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.81107: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.81111: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.81114: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.82338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.84644: done with get_vars() 28983 1726882988.84664: variable 'ansible_search_path' from source: unknown 28983 1726882988.84665: variable 'ansible_search_path' from source: unknown 28983 1726882988.84699: we have included files to process 28983 1726882988.84700: generating all_blocks data 28983 1726882988.84702: done generating all_blocks data 28983 1726882988.84707: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726882988.84708: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726882988.84709: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726882988.84851: in VariableManager get_vars() 28983 1726882988.84867: done with get_vars() 28983 1726882988.84960: done processing included file 28983 1726882988.84962: iterating over new_blocks loaded from include file 28983 1726882988.84963: in VariableManager get_vars() 28983 1726882988.84976: done with get_vars() 28983 1726882988.84977: filtering new block on tags 28983 1726882988.85007: done filtering new block on tags 28983 1726882988.85009: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item={'what': 
'tasks/assert_device_present.yml', 'condition': True}) 28983 1726882988.85013: extending task lists for all hosts with included blocks 28983 1726882988.86146: done extending task lists 28983 1726882988.86147: done processing included files 28983 1726882988.86148: results queue empty 28983 1726882988.86149: checking for any_errors_fatal 28983 1726882988.86153: done checking for any_errors_fatal 28983 1726882988.86154: checking for max_fail_percentage 28983 1726882988.86156: done checking for max_fail_percentage 28983 1726882988.86157: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.86157: done checking to see if all hosts have failed 28983 1726882988.86158: getting the remaining hosts for this loop 28983 1726882988.86160: done getting the remaining hosts for this loop 28983 1726882988.86163: getting the next task for host managed_node2 28983 1726882988.86168: done getting next task for host managed_node2 28983 1726882988.86171: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726882988.86174: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.86181: getting variables 28983 1726882988.86182: in VariableManager get_vars() 28983 1726882988.86192: Calling all_inventory to load vars for managed_node2 28983 1726882988.86195: Calling groups_inventory to load vars for managed_node2 28983 1726882988.86198: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.86204: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.86207: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.86211: Calling groups_plugins_play to load vars for managed_node2 28983 1726882988.88439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.91038: done with get_vars() 28983 1726882988.91059: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:43:08 -0400 (0:00:00.148) 0:00:18.909 ****** 28983 1726882988.91127: entering _queue_task() for managed_node2/include_tasks 28983 1726882988.91396: worker is 1 (out of 1 available) 28983 1726882988.91410: exiting _queue_task() for managed_node2/include_tasks 28983 1726882988.91424: done queuing things up, now waiting for results queue to drain 28983 1726882988.91425: waiting for pending results... 
28983 1726882988.91613: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726882988.91712: in run() - task 0affe814-3a2d-b16d-c0a7-000000000452 28983 1726882988.91724: variable 'ansible_search_path' from source: unknown 28983 1726882988.91952: variable 'ansible_search_path' from source: unknown 28983 1726882988.91957: calling self._execute() 28983 1726882988.91960: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882988.91963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882988.91966: variable 'omit' from source: magic vars 28983 1726882988.92940: variable 'ansible_distribution_major_version' from source: facts 28983 1726882988.92944: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882988.92947: _execute() done 28983 1726882988.92951: dumping result to json 28983 1726882988.92954: done dumping result, returning 28983 1726882988.92957: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-000000000452] 28983 1726882988.92960: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000452 28983 1726882988.93038: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000452 28983 1726882988.93042: WORKER PROCESS EXITING 28983 1726882988.93077: no more pending results, returning what we have 28983 1726882988.93083: in VariableManager get_vars() 28983 1726882988.93122: Calling all_inventory to load vars for managed_node2 28983 1726882988.93126: Calling groups_inventory to load vars for managed_node2 28983 1726882988.93130: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.93247: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.93251: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.93255: Calling groups_plugins_play to load vars for managed_node2 28983 
1726882988.95936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882988.98982: done with get_vars() 28983 1726882988.99013: variable 'ansible_search_path' from source: unknown 28983 1726882988.99014: variable 'ansible_search_path' from source: unknown 28983 1726882988.99176: variable 'item' from source: include params 28983 1726882988.99220: we have included files to process 28983 1726882988.99222: generating all_blocks data 28983 1726882988.99224: done generating all_blocks data 28983 1726882988.99226: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882988.99228: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882988.99230: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882988.99451: done processing included file 28983 1726882988.99454: iterating over new_blocks loaded from include file 28983 1726882988.99455: in VariableManager get_vars() 28983 1726882988.99474: done with get_vars() 28983 1726882988.99476: filtering new block on tags 28983 1726882988.99507: done filtering new block on tags 28983 1726882988.99510: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726882988.99516: extending task lists for all hosts with included blocks 28983 1726882988.99729: done extending task lists 28983 1726882988.99731: done processing included files 28983 1726882988.99732: results queue empty 28983 1726882988.99732: checking for any_errors_fatal 28983 1726882988.99738: done checking for any_errors_fatal 28983 1726882988.99740: checking for 
max_fail_percentage 28983 1726882988.99741: done checking for max_fail_percentage 28983 1726882988.99742: checking to see if all hosts have failed and the running result is not ok 28983 1726882988.99743: done checking to see if all hosts have failed 28983 1726882988.99744: getting the remaining hosts for this loop 28983 1726882988.99746: done getting the remaining hosts for this loop 28983 1726882988.99749: getting the next task for host managed_node2 28983 1726882988.99754: done getting next task for host managed_node2 28983 1726882988.99757: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726882988.99761: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882988.99764: getting variables 28983 1726882988.99765: in VariableManager get_vars() 28983 1726882988.99775: Calling all_inventory to load vars for managed_node2 28983 1726882988.99778: Calling groups_inventory to load vars for managed_node2 28983 1726882988.99781: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882988.99787: Calling all_plugins_play to load vars for managed_node2 28983 1726882988.99790: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882988.99794: Calling groups_plugins_play to load vars for managed_node2 28983 1726882989.01806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882989.04580: done with get_vars() 28983 1726882989.04614: done getting variables 28983 1726882989.04769: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:43:09 -0400 (0:00:00.136) 0:00:19.045 ****** 28983 1726882989.04807: entering _queue_task() for managed_node2/stat 28983 1726882989.05219: worker is 1 (out of 1 available) 28983 1726882989.05233: exiting _queue_task() for managed_node2/stat 28983 1726882989.05250: done queuing things up, now waiting for results queue to drain 28983 1726882989.05252: waiting for pending results... 
28983 1726882989.05691: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726882989.05794: in run() - task 0affe814-3a2d-b16d-c0a7-0000000004e8 28983 1726882989.05807: variable 'ansible_search_path' from source: unknown 28983 1726882989.05811: variable 'ansible_search_path' from source: unknown 28983 1726882989.05853: calling self._execute() 28983 1726882989.05935: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.05942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.05954: variable 'omit' from source: magic vars 28983 1726882989.06275: variable 'ansible_distribution_major_version' from source: facts 28983 1726882989.06288: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882989.06294: variable 'omit' from source: magic vars 28983 1726882989.06344: variable 'omit' from source: magic vars 28983 1726882989.06426: variable 'interface' from source: play vars 28983 1726882989.06443: variable 'omit' from source: magic vars 28983 1726882989.06485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882989.06519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882989.06541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882989.06557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.06567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.06597: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882989.06601: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.06605: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.06691: Set connection var ansible_connection to ssh 28983 1726882989.06701: Set connection var ansible_shell_executable to /bin/sh 28983 1726882989.06710: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882989.06720: Set connection var ansible_timeout to 10 28983 1726882989.06726: Set connection var ansible_pipelining to False 28983 1726882989.06730: Set connection var ansible_shell_type to sh 28983 1726882989.06751: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.06755: variable 'ansible_connection' from source: unknown 28983 1726882989.06757: variable 'ansible_module_compression' from source: unknown 28983 1726882989.06762: variable 'ansible_shell_type' from source: unknown 28983 1726882989.06765: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.06769: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.06774: variable 'ansible_pipelining' from source: unknown 28983 1726882989.06780: variable 'ansible_timeout' from source: unknown 28983 1726882989.06785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.06952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882989.06962: variable 'omit' from source: magic vars 28983 1726882989.06970: starting attempt loop 28983 1726882989.06973: running the handler 28983 1726882989.06987: _low_level_execute_command(): starting 28983 1726882989.06996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882989.07524: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882989.07611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882989.07663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.07739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.09532: stdout chunk (state=3): >>>/root <<< 28983 1726882989.09644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882989.09691: stderr chunk (state=3): >>><<< 28983 1726882989.09698: stdout chunk (state=3): >>><<< 28983 1726882989.09725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882989.09736: _low_level_execute_command(): starting 28983 1726882989.09745: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946 `" && echo ansible-tmp-1726882989.0972068-29730-47236672633946="` echo /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946 `" ) && sleep 0' 28983 1726882989.10176: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.10180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.10183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882989.10191: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.10243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882989.10250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.10330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.12352: stdout chunk (state=3): >>>ansible-tmp-1726882989.0972068-29730-47236672633946=/root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946 <<< 28983 1726882989.12508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882989.12515: stderr chunk (state=3): >>><<< 28983 1726882989.12519: stdout chunk (state=3): >>><<< 28983 1726882989.12549: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882989.0972068-29730-47236672633946=/root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882989.12587: variable 'ansible_module_compression' from source: unknown 28983 1726882989.12636: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726882989.12668: variable 'ansible_facts' from source: unknown 28983 1726882989.12723: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py 28983 1726882989.12826: Sending initial data 28983 1726882989.12830: Sent initial data (152 bytes) 28983 1726882989.13353: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.13357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.13442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.13503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.15298: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882989.15369: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882989.15433: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpmm2co_ix /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py <<< 28983 1726882989.15438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py" <<< 28983 1726882989.15508: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpmm2co_ix" to remote "/root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py" <<< 28983 1726882989.16956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882989.17166: stderr chunk (state=3): >>><<< 28983 1726882989.17169: stdout chunk (state=3): >>><<< 28983 1726882989.17172: done transferring module to remote 28983 1726882989.17174: _low_level_execute_command(): starting 28983 1726882989.17176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/ /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py && sleep 0' 28983 1726882989.17693: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882989.17743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.17757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.17777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882989.17840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.17890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882989.17937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.17994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.19918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882989.19969: stderr chunk (state=3): >>><<< 28983 1726882989.19971: stdout chunk (state=3): >>><<< 28983 1726882989.20013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882989.20016: _low_level_execute_command(): starting 28983 1726882989.20019: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/AnsiballZ_stat.py && sleep 0' 28983 1726882989.20414: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882989.20417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.20419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882989.20422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.20476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882989.20484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 
1726882989.20557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.37915: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38024, "dev": 23, "nlink": 1, "atime": 1726882985.7773488, "mtime": 1726882985.7773488, "ctime": 1726882985.7773488, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726882989.39457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882989.39517: stderr chunk (state=3): >>><<< 28983 1726882989.39741: stdout chunk (state=3): >>><<< 28983 1726882989.39745: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38024, "dev": 23, "nlink": 1, "atime": 1726882985.7773488, "mtime": 1726882985.7773488, "ctime": 1726882985.7773488, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882989.39748: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882989.39751: _low_level_execute_command(): starting 28983 1726882989.39753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882989.0972068-29730-47236672633946/ > /dev/null 2>&1 && sleep 0' 28983 1726882989.40361: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882989.40379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.40433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882989.40461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.40554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882989.40576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882989.40601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.40699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.42747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882989.42982: stdout chunk (state=3): >>><<< 28983 1726882989.42985: stderr chunk (state=3): >>><<< 28983 1726882989.42988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882989.42990: handler run complete 28983 1726882989.42992: attempt loop complete, returning result 28983 1726882989.42994: _execute() done 28983 1726882989.42996: dumping result to json 28983 1726882989.42998: done dumping result, returning 28983 1726882989.43000: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-0000000004e8] 28983 1726882989.43002: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000004e8 28983 1726882989.43441: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000004e8 28983 1726882989.43445: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882985.7773488, "block_size": 4096, "blocks": 0, "ctime": 1726882985.7773488, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38024, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882985.7773488, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 28983 1726882989.43588: no more pending results, returning what we have 28983 1726882989.43592: results queue empty 28983 1726882989.43593: checking for any_errors_fatal 28983 1726882989.43595: done 
checking for any_errors_fatal 28983 1726882989.43596: checking for max_fail_percentage 28983 1726882989.43598: done checking for max_fail_percentage 28983 1726882989.43600: checking to see if all hosts have failed and the running result is not ok 28983 1726882989.43601: done checking to see if all hosts have failed 28983 1726882989.43602: getting the remaining hosts for this loop 28983 1726882989.43604: done getting the remaining hosts for this loop 28983 1726882989.43610: getting the next task for host managed_node2 28983 1726882989.43620: done getting next task for host managed_node2 28983 1726882989.43624: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28983 1726882989.43629: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882989.43640: getting variables 28983 1726882989.43642: in VariableManager get_vars() 28983 1726882989.43684: Calling all_inventory to load vars for managed_node2 28983 1726882989.43688: Calling groups_inventory to load vars for managed_node2 28983 1726882989.43693: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882989.43705: Calling all_plugins_play to load vars for managed_node2 28983 1726882989.43709: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882989.43713: Calling groups_plugins_play to load vars for managed_node2 28983 1726882989.46419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882989.49591: done with get_vars() 28983 1726882989.49634: done getting variables 28983 1726882989.49707: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882989.49853: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:43:09 -0400 (0:00:00.450) 0:00:19.496 ****** 28983 1726882989.49890: entering _queue_task() for managed_node2/assert 28983 1726882989.50269: worker is 1 (out of 1 available) 28983 1726882989.50540: exiting _queue_task() for managed_node2/assert 28983 1726882989.50552: done queuing things up, now waiting for results queue to drain 28983 1726882989.50554: waiting for pending results... 
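The stat result dumped above ("Get stat for interface statebr") maps directly onto the `module_args` recorded in the log (`path: /sys/class/net/statebr`, with `get_attributes`, `get_checksum`, and `get_mime` all false). As a sketch only (not the verbatim task from `assert_device_present.yml`; the `interface_stat` register name is inferred from the `set_fact`-sourced variable the later assert reads), the task was likely equivalent to:

```yaml
# Sketch reconstructed from the logged module_args; names are assumptions.
- name: Get stat for interface '{{ interface }}'
  stat:
    path: "/sys/class/net/{{ interface }}"  # a symlink to /sys/devices/virtual/net/<iface>
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat  # assumed; the log shows 'interface_stat' from set_fact
```

Note that the module reports `islnk: true` with `lnk_target: ../../devices/virtual/net/statebr`, which is the normal shape of a `/sys/class/net` entry for a virtual interface, so `stat.exists` is the meaningful field here, not `isreg`.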
28983 1726882989.50628: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 28983 1726882989.50789: in run() - task 0affe814-3a2d-b16d-c0a7-000000000453 28983 1726882989.50812: variable 'ansible_search_path' from source: unknown 28983 1726882989.50821: variable 'ansible_search_path' from source: unknown 28983 1726882989.50877: calling self._execute() 28983 1726882989.51009: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.51023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.51044: variable 'omit' from source: magic vars 28983 1726882989.51542: variable 'ansible_distribution_major_version' from source: facts 28983 1726882989.51548: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882989.51555: variable 'omit' from source: magic vars 28983 1726882989.51617: variable 'omit' from source: magic vars 28983 1726882989.51758: variable 'interface' from source: play vars 28983 1726882989.51839: variable 'omit' from source: magic vars 28983 1726882989.51845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882989.51904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882989.51932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882989.51961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.51990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.52033: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882989.52045: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.52055: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.52199: Set connection var ansible_connection to ssh 28983 1726882989.52238: Set connection var ansible_shell_executable to /bin/sh 28983 1726882989.52241: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882989.52253: Set connection var ansible_timeout to 10 28983 1726882989.52264: Set connection var ansible_pipelining to False 28983 1726882989.52308: Set connection var ansible_shell_type to sh 28983 1726882989.52314: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.52322: variable 'ansible_connection' from source: unknown 28983 1726882989.52331: variable 'ansible_module_compression' from source: unknown 28983 1726882989.52341: variable 'ansible_shell_type' from source: unknown 28983 1726882989.52349: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.52357: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.52417: variable 'ansible_pipelining' from source: unknown 28983 1726882989.52420: variable 'ansible_timeout' from source: unknown 28983 1726882989.52422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.52580: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882989.52599: variable 'omit' from source: magic vars 28983 1726882989.52610: starting attempt loop 28983 1726882989.52618: running the handler 28983 1726882989.52813: variable 'interface_stat' from source: set_fact 28983 1726882989.52851: Evaluated conditional (interface_stat.stat.exists): True 28983 1726882989.52868: handler run complete 28983 1726882989.52939: attempt loop complete, returning result 28983 
1726882989.52943: _execute() done 28983 1726882989.52946: dumping result to json 28983 1726882989.52948: done dumping result, returning 28983 1726882989.52950: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [0affe814-3a2d-b16d-c0a7-000000000453] 28983 1726882989.52954: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000453 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726882989.53132: no more pending results, returning what we have 28983 1726882989.53139: results queue empty 28983 1726882989.53140: checking for any_errors_fatal 28983 1726882989.53152: done checking for any_errors_fatal 28983 1726882989.53153: checking for max_fail_percentage 28983 1726882989.53156: done checking for max_fail_percentage 28983 1726882989.53158: checking to see if all hosts have failed and the running result is not ok 28983 1726882989.53159: done checking to see if all hosts have failed 28983 1726882989.53160: getting the remaining hosts for this loop 28983 1726882989.53162: done getting the remaining hosts for this loop 28983 1726882989.53168: getting the next task for host managed_node2 28983 1726882989.53346: done getting next task for host managed_node2 28983 1726882989.53352: ^ task is: TASK: Success in test '{{ lsr_description }}' 28983 1726882989.53355: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882989.53361: getting variables 28983 1726882989.53362: in VariableManager get_vars() 28983 1726882989.53401: Calling all_inventory to load vars for managed_node2 28983 1726882989.53405: Calling groups_inventory to load vars for managed_node2 28983 1726882989.53409: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882989.53420: Calling all_plugins_play to load vars for managed_node2 28983 1726882989.53424: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882989.53428: Calling groups_plugins_play to load vars for managed_node2 28983 1726882989.53447: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000453 28983 1726882989.53450: WORKER PROCESS EXITING 28983 1726882989.56083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882989.59242: done with get_vars() 28983 1726882989.59295: done getting variables 28983 1726882989.59384: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882989.59529: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:43:09 -0400 (0:00:00.096) 0:00:19.593 ****** 28983 1726882989.59576: entering _queue_task() for managed_node2/debug 28983 1726882989.59977: worker is 1 (out of 1 available) 28983 1726882989.59995: exiting _queue_task() for managed_node2/debug 28983 1726882989.60008: done queuing things up, now waiting for results queue to drain 28983 1726882989.60011: waiting for pending results... 
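The assert task above runs entirely on the controller: the log shows `Evaluated conditional (interface_stat.stat.exists): True` followed by "All assertions passed", with no remote command executed. A minimal sketch of such a task (assuming the condition is the only assertion, which is all the log confirms) would be:

```yaml
# Sketch based on the evaluated conditional in the log; not the verbatim task.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists  # set by the preceding stat task
```

This also explains why this task completes in ~0.1s while the stat task took ~0.45s: `assert` is an action plugin with no module payload to transfer over SSH.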
28983 1726882989.60343: running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile' 28983 1726882989.60499: in run() - task 0affe814-3a2d-b16d-c0a7-000000000098 28983 1726882989.60524: variable 'ansible_search_path' from source: unknown 28983 1726882989.60537: variable 'ansible_search_path' from source: unknown 28983 1726882989.60595: calling self._execute() 28983 1726882989.60720: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.60736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.60753: variable 'omit' from source: magic vars 28983 1726882989.61227: variable 'ansible_distribution_major_version' from source: facts 28983 1726882989.61253: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882989.61266: variable 'omit' from source: magic vars 28983 1726882989.61325: variable 'omit' from source: magic vars 28983 1726882989.61471: variable 'lsr_description' from source: include params 28983 1726882989.61502: variable 'omit' from source: magic vars 28983 1726882989.61639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882989.61643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882989.61645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882989.61680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.61700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.61744: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882989.61755: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.61768: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.61911: Set connection var ansible_connection to ssh 28983 1726882989.61930: Set connection var ansible_shell_executable to /bin/sh 28983 1726882989.61949: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882989.61966: Set connection var ansible_timeout to 10 28983 1726882989.61985: Set connection var ansible_pipelining to False 28983 1726882989.62003: Set connection var ansible_shell_type to sh 28983 1726882989.62036: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.62102: variable 'ansible_connection' from source: unknown 28983 1726882989.62110: variable 'ansible_module_compression' from source: unknown 28983 1726882989.62113: variable 'ansible_shell_type' from source: unknown 28983 1726882989.62116: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.62119: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.62121: variable 'ansible_pipelining' from source: unknown 28983 1726882989.62123: variable 'ansible_timeout' from source: unknown 28983 1726882989.62126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.62300: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882989.62329: variable 'omit' from source: magic vars 28983 1726882989.62344: starting attempt loop 28983 1726882989.62354: running the handler 28983 1726882989.62415: handler run complete 28983 1726882989.62538: attempt loop complete, returning result 28983 1726882989.62542: _execute() done 28983 1726882989.62550: dumping result to json 28983 1726882989.62552: done dumping result, returning 28983 
1726882989.62555: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile' [0affe814-3a2d-b16d-c0a7-000000000098] 28983 1726882989.62557: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000098 28983 1726882989.62631: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000098 28983 1726882989.62636: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 28983 1726882989.62709: no more pending results, returning what we have 28983 1726882989.62713: results queue empty 28983 1726882989.62714: checking for any_errors_fatal 28983 1726882989.62724: done checking for any_errors_fatal 28983 1726882989.62726: checking for max_fail_percentage 28983 1726882989.62728: done checking for max_fail_percentage 28983 1726882989.62729: checking to see if all hosts have failed and the running result is not ok 28983 1726882989.62730: done checking to see if all hosts have failed 28983 1726882989.62731: getting the remaining hosts for this loop 28983 1726882989.62735: done getting the remaining hosts for this loop 28983 1726882989.62741: getting the next task for host managed_node2 28983 1726882989.62750: done getting next task for host managed_node2 28983 1726882989.62755: ^ task is: TASK: Cleanup 28983 1726882989.62759: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
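
The `ok: [managed_node2]` result above comes from a `debug` action (the log loads `plugins/action/debug.py`). A minimal sketch of the task that would produce that `MSG` line, assuming the test playbook interpolates `lsr_description` (which the log shows coming from include params) into both the task name and the message:

```yaml
# Hedged reconstruction, not the actual test source. The "+++++" framing
# and the lsr_description variable are taken from the log output above.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
```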
False 28983 1726882989.62882: getting variables 28983 1726882989.62884: in VariableManager get_vars() 28983 1726882989.62922: Calling all_inventory to load vars for managed_node2 28983 1726882989.62926: Calling groups_inventory to load vars for managed_node2 28983 1726882989.62931: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882989.63000: Calling all_plugins_play to load vars for managed_node2 28983 1726882989.63005: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882989.63009: Calling groups_plugins_play to load vars for managed_node2 28983 1726882989.65524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882989.68649: done with get_vars() 28983 1726882989.68690: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:43:09 -0400 (0:00:00.092) 0:00:19.685 ****** 28983 1726882989.68810: entering _queue_task() for managed_node2/include_tasks 28983 1726882989.69306: worker is 1 (out of 1 available) 28983 1726882989.69320: exiting _queue_task() for managed_node2/include_tasks 28983 1726882989.69332: done queuing things up, now waiting for results queue to drain 28983 1726882989.69337: waiting for pending results... 
28983 1726882989.69615: running TaskExecutor() for managed_node2/TASK: Cleanup 28983 1726882989.69740: in run() - task 0affe814-3a2d-b16d-c0a7-00000000009c 28983 1726882989.69746: variable 'ansible_search_path' from source: unknown 28983 1726882989.69748: variable 'ansible_search_path' from source: unknown 28983 1726882989.69806: variable 'lsr_cleanup' from source: include params 28983 1726882989.70055: variable 'lsr_cleanup' from source: include params 28983 1726882989.70240: variable 'omit' from source: magic vars 28983 1726882989.70321: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.70341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.70368: variable 'omit' from source: magic vars 28983 1726882989.70699: variable 'ansible_distribution_major_version' from source: facts 28983 1726882989.70717: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882989.70729: variable 'item' from source: unknown 28983 1726882989.70823: variable 'item' from source: unknown 28983 1726882989.70871: variable 'item' from source: unknown 28983 1726882989.70966: variable 'item' from source: unknown 28983 1726882989.71345: dumping result to json 28983 1726882989.71349: done dumping result, returning 28983 1726882989.71352: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affe814-3a2d-b16d-c0a7-00000000009c] 28983 1726882989.71354: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000009c 28983 1726882989.71402: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000009c 28983 1726882989.71406: WORKER PROCESS EXITING 28983 1726882989.71436: no more pending results, returning what we have 28983 1726882989.71441: in VariableManager get_vars() 28983 1726882989.71487: Calling all_inventory to load vars for managed_node2 28983 1726882989.71491: Calling groups_inventory to load vars for managed_node2 28983 1726882989.71496: Calling 
all_plugins_inventory to load vars for managed_node2 28983 1726882989.71508: Calling all_plugins_play to load vars for managed_node2 28983 1726882989.71512: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882989.71517: Calling groups_plugins_play to load vars for managed_node2 28983 1726882989.74152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882989.77256: done with get_vars() 28983 1726882989.77295: variable 'ansible_search_path' from source: unknown 28983 1726882989.77296: variable 'ansible_search_path' from source: unknown 28983 1726882989.77353: we have included files to process 28983 1726882989.77355: generating all_blocks data 28983 1726882989.77358: done generating all_blocks data 28983 1726882989.77364: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726882989.77365: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726882989.77368: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726882989.77686: done processing included file 28983 1726882989.77689: iterating over new_blocks loaded from include file 28983 1726882989.77690: in VariableManager get_vars() 28983 1726882989.77708: done with get_vars() 28983 1726882989.77710: filtering new block on tags 28983 1726882989.77742: done filtering new block on tags 28983 1726882989.77745: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 28983 1726882989.77751: extending task lists for all hosts with included blocks 
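
The `TASK: Cleanup` entry above is an `include_tasks` that loops over `lsr_cleanup` (the log resolves `item` to `tasks/cleanup_profile+device.yml` and evaluates the distribution conditional). A hedged sketch of what that task in `run_test.yml` likely looks like, with the loop/when structure inferred from the logged variable sources:

```yaml
# Assumption: exact keys/formatting in run_test.yml may differ; the loop over
# lsr_cleanup, the item value, and the version conditional are all visible
# in the log above.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
  when: ansible_distribution_major_version != '6'
```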
28983 1726882989.79816: done extending task lists 28983 1726882989.79818: done processing included files 28983 1726882989.79819: results queue empty 28983 1726882989.79820: checking for any_errors_fatal 28983 1726882989.79829: done checking for any_errors_fatal 28983 1726882989.79830: checking for max_fail_percentage 28983 1726882989.79832: done checking for max_fail_percentage 28983 1726882989.79833: checking to see if all hosts have failed and the running result is not ok 28983 1726882989.79835: done checking to see if all hosts have failed 28983 1726882989.79836: getting the remaining hosts for this loop 28983 1726882989.79838: done getting the remaining hosts for this loop 28983 1726882989.79841: getting the next task for host managed_node2 28983 1726882989.79847: done getting next task for host managed_node2 28983 1726882989.79850: ^ task is: TASK: Cleanup profile and device 28983 1726882989.79854: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882989.79857: getting variables 28983 1726882989.79858: in VariableManager get_vars() 28983 1726882989.79868: Calling all_inventory to load vars for managed_node2 28983 1726882989.79871: Calling groups_inventory to load vars for managed_node2 28983 1726882989.79877: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882989.79884: Calling all_plugins_play to load vars for managed_node2 28983 1726882989.79887: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882989.79891: Calling groups_plugins_play to load vars for managed_node2 28983 1726882989.82164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882989.85546: done with get_vars() 28983 1726882989.85587: done getting variables 28983 1726882989.85650: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:43:09 -0400 (0:00:00.168) 0:00:19.854 ****** 28983 1726882989.85686: entering _queue_task() for managed_node2/shell 28983 1726882989.86201: worker is 1 (out of 1 available) 28983 1726882989.86214: exiting _queue_task() for managed_node2/shell 28983 1726882989.86225: done queuing things up, now waiting for results queue to drain 28983 1726882989.86227: waiting for pending results... 
28983 1726882989.86452: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 28983 1726882989.86626: in run() - task 0affe814-3a2d-b16d-c0a7-00000000050b 28983 1726882989.86630: variable 'ansible_search_path' from source: unknown 28983 1726882989.86635: variable 'ansible_search_path' from source: unknown 28983 1726882989.86675: calling self._execute() 28983 1726882989.86843: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.86847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.86850: variable 'omit' from source: magic vars 28983 1726882989.87278: variable 'ansible_distribution_major_version' from source: facts 28983 1726882989.87301: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882989.87313: variable 'omit' from source: magic vars 28983 1726882989.87375: variable 'omit' from source: magic vars 28983 1726882989.87578: variable 'interface' from source: play vars 28983 1726882989.87612: variable 'omit' from source: magic vars 28983 1726882989.87669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882989.87728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882989.87823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882989.87826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.87829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882989.87839: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882989.87850: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.87858: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.88041: Set connection var ansible_connection to ssh 28983 1726882989.88047: Set connection var ansible_shell_executable to /bin/sh 28983 1726882989.88052: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882989.88055: Set connection var ansible_timeout to 10 28983 1726882989.88057: Set connection var ansible_pipelining to False 28983 1726882989.88059: Set connection var ansible_shell_type to sh 28983 1726882989.88085: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.88094: variable 'ansible_connection' from source: unknown 28983 1726882989.88101: variable 'ansible_module_compression' from source: unknown 28983 1726882989.88110: variable 'ansible_shell_type' from source: unknown 28983 1726882989.88117: variable 'ansible_shell_executable' from source: unknown 28983 1726882989.88124: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882989.88132: variable 'ansible_pipelining' from source: unknown 28983 1726882989.88142: variable 'ansible_timeout' from source: unknown 28983 1726882989.88161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882989.88370: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882989.88375: variable 'omit' from source: magic vars 28983 1726882989.88381: starting attempt loop 28983 1726882989.88383: running the handler 28983 1726882989.88386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882989.88408: _low_level_execute_command(): starting 28983 1726882989.88422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882989.89220: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882989.89271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882989.89299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.89376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.89417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882989.89439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882989.89460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.89605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.91390: stdout chunk (state=3): >>>/root <<< 28983 1726882989.91779: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 28983 1726882989.91783: stdout chunk (state=3): >>><<< 28983 1726882989.91786: stderr chunk (state=3): >>><<< 28983 1726882989.91790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882989.91793: _low_level_execute_command(): starting 28983 1726882989.91797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046 `" && echo ansible-tmp-1726882989.9168196-29770-215771438790046="` echo /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046 `" ) && sleep 0' 28983 1726882989.93020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726882989.93032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726882989.93037: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726882989.93039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882989.93042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882989.93045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882989.93047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882989.93049: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726882989.93051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.93053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882989.93055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882989.93148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.93220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.95290: stdout chunk (state=3): >>>ansible-tmp-1726882989.9168196-29770-215771438790046=/root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046 <<< 28983 1726882989.95572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882989.95576: stdout chunk (state=3): >>><<< 28983 1726882989.95640: stderr chunk 
(state=3): >>><<< 28983 1726882989.95644: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882989.9168196-29770-215771438790046=/root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882989.95659: variable 'ansible_module_compression' from source: unknown 28983 1726882989.95719: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726882989.95871: variable 'ansible_facts' from source: unknown 28983 1726882989.96039: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py 28983 1726882989.96470: Sending initial data 28983 1726882989.96488: Sent initial data (156 bytes) 28983 1726882989.97751: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
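
The `_low_level_execute_command()` calls around this point follow Ansible's standard remote-execution sequence: create a private temp directory, transfer the `AnsiballZ_command.py` wrapper over SFTP, `chmod` it, execute it with the remote Python, then remove the temp directory. A local re-enactment of that sequence, with stand-in paths rather than the real `/root/.ansible` ones, would be:

```shell
# Hedged local sketch of the remote steps visible in the log; paths and the
# trivial "module" body are stand-ins, not what Ansible actually ships.
set -e
base="${TMPDIR:-/tmp}/ansible-demo-$$"
( umask 77 && mkdir -p "$base" )                      # 1) mkdir private tmp dir
printf 'print("ok")\n' > "$base/AnsiballZ_command.py" # 2) "transfer" the module
chmod u+x "$base" "$base/AnsiballZ_command.py"        # 3) chmod u+x dir + module
python3 "$base/AnsiballZ_command.py"                  # 4) run with remote python
rm -rf "$base"                                        # 5) rm -f -r the tmp dir
```

In the real log each step is a separate `/bin/sh -c '…'` over the multiplexed SSH connection, which is why every step is wrapped in its own OpenSSH debug chatter.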
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882989.97888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882989.97967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882989.98037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882989.99817: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726882989.99827: 
stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882989.99949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726882989.99974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpq6h1qc7z /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py <<< 28983 1726882989.99981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py" <<< 28983 1726882990.00068: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpq6h1qc7z" to remote "/root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py" <<< 28983 1726882990.02223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882990.02449: stderr chunk (state=3): >>><<< 28983 1726882990.02453: stdout chunk (state=3): >>><<< 28983 1726882990.02485: done transferring module to remote 28983 1726882990.02497: _low_level_execute_command(): starting 28983 1726882990.02505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/ /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py && sleep 0' 28983 1726882990.03754: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882990.03758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882990.03780: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882990.03787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882990.03802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882990.03810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882990.04085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882990.04096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882990.04190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882990.06140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882990.06207: stderr chunk (state=3): >>><<< 28983 1726882990.06211: stdout chunk (state=3): >>><<< 28983 1726882990.06230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882990.06235: _low_level_execute_command(): starting 28983 1726882990.06242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/AnsiballZ_command.py && sleep 0' 28983 1726882990.07446: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882990.07449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882990.07452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882990.07454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882990.07457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882990.07748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882990.07793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882990.31470: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (d5d673f8-3c8b-4cfe-b951-473f5117625f) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:43:10.249041", "end": "2024-09-20 21:43:10.312226", "delta": "0:00:00.063185", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726882990.33054: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882990.33058: stdout chunk (state=3): >>><<< 28983 1726882990.33068: stderr chunk (state=3): >>><<< 28983 1726882990.33093: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (d5d673f8-3c8b-4cfe-b951-473f5117625f) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:43:10.249041", "end": "2024-09-20 21:43:10.312226", "delta": "0:00:00.063185", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 28983 1726882990.33151: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882990.33162: _low_level_execute_command(): starting 28983 1726882990.33168: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882989.9168196-29770-215771438790046/ > /dev/null 2>&1 && sleep 0' 28983 1726882990.34790: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882990.34794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 
28983 1726882990.34797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726882990.34800: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882990.34802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882990.34805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882990.34807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882990.34992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882990.35043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882990.37127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882990.37130: stdout chunk (state=3): >>><<< 28983 1726882990.37132: stderr chunk (state=3): >>><<< 28983 1726882990.37150: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882990.37164: handler run complete 28983 1726882990.37541: Evaluated conditional (False): False 28983 1726882990.37544: attempt loop complete, returning result 28983 1726882990.37547: _execute() done 28983 1726882990.37549: dumping result to json 28983 1726882990.37551: done dumping result, returning 28983 1726882990.37554: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0affe814-3a2d-b16d-c0a7-00000000050b] 28983 1726882990.37556: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000050b 28983 1726882990.37649: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000050b 28983 1726882990.37653: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.063185", "end": "2024-09-20 21:43:10.312226", "rc": 1, "start": "2024-09-20 21:43:10.249041" } STDOUT: Connection 'statebr' (d5d673f8-3c8b-4cfe-b951-473f5117625f) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 28983 1726882990.37749: no more pending results, returning what we have 28983 1726882990.37753: results queue empty 28983 1726882990.37754: checking for any_errors_fatal 28983 1726882990.37756: done checking for any_errors_fatal 28983 1726882990.37758: checking for max_fail_percentage 28983 1726882990.37760: done checking for max_fail_percentage 28983 1726882990.37762: checking to see if all hosts have failed and the running result is not ok 28983 1726882990.37763: done checking to see if all hosts have failed 28983 1726882990.37764: getting the remaining hosts for this loop 28983 1726882990.37766: done getting the remaining hosts for this loop 28983 1726882990.37771: getting the next task for host managed_node2 28983 1726882990.37785: done getting next task for host managed_node2 28983 1726882990.37789: ^ task is: TASK: Include the task 'run_test.yml' 28983 1726882990.37792: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882990.37798: getting variables 28983 1726882990.37799: in VariableManager get_vars() 28983 1726882990.37941: Calling all_inventory to load vars for managed_node2 28983 1726882990.37949: Calling groups_inventory to load vars for managed_node2 28983 1726882990.37953: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882990.37965: Calling all_plugins_play to load vars for managed_node2 28983 1726882990.37968: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882990.37972: Calling groups_plugins_play to load vars for managed_node2 28983 1726882990.42355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882990.48694: done with get_vars() 28983 1726882990.48738: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Friday 20 September 2024 21:43:10 -0400 (0:00:00.633) 0:00:20.488 ****** 28983 1726882990.49027: entering _queue_task() for managed_node2/include_tasks 28983 1726882990.49722: worker is 1 (out of 1 available) 28983 1726882990.49871: exiting _queue_task() for managed_node2/include_tasks 28983 1726882990.49885: done queuing things up, now waiting for results queue to drain 28983 1726882990.49887: waiting for pending results... 
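The failed-but-ignored "Cleanup profile and device" result logged above comes from a shell task executed via ansible.legacy.command. Reconstructed from the `_raw_params` in the module invocation and the `...ignoring` marker, the task probably has roughly this shape (a sketch only; the exact layout in tests_states.yml may differ, and the `ignore_errors` placement is an assumption inferred from the log):

```yaml
# Hypothetical reconstruction of the "Cleanup profile and device" task,
# based on the _raw_params and the "...ignoring" marker in the log above.
- name: Cleanup profile and device
  shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true  # rc=1 is expected when the profile or device is already gone
```

This explains why the play continues despite `fatal: [managed_node2]: FAILED!`: the cleanup is best-effort, and a missing ifcfg file or device simply means there is nothing left to remove.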
28983 1726882990.50355: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 28983 1726882990.50532: in run() - task 0affe814-3a2d-b16d-c0a7-00000000000f 28983 1726882990.50740: variable 'ansible_search_path' from source: unknown 28983 1726882990.50744: calling self._execute() 28983 1726882990.50747: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.50751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.50754: variable 'omit' from source: magic vars 28983 1726882990.51241: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.51261: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.51272: _execute() done 28983 1726882990.51286: dumping result to json 28983 1726882990.51296: done dumping result, returning 28983 1726882990.51308: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0affe814-3a2d-b16d-c0a7-00000000000f] 28983 1726882990.51321: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000000f 28983 1726882990.51565: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000000f 28983 1726882990.51569: WORKER PROCESS EXITING 28983 1726882990.51600: no more pending results, returning what we have 28983 1726882990.51605: in VariableManager get_vars() 28983 1726882990.51648: Calling all_inventory to load vars for managed_node2 28983 1726882990.51651: Calling groups_inventory to load vars for managed_node2 28983 1726882990.51655: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882990.51667: Calling all_plugins_play to load vars for managed_node2 28983 1726882990.51682: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882990.51686: Calling groups_plugins_play to load vars for managed_node2 28983 1726882990.54205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28983 1726882990.57330: done with get_vars() 28983 1726882990.57364: variable 'ansible_search_path' from source: unknown 28983 1726882990.57382: we have included files to process 28983 1726882990.57383: generating all_blocks data 28983 1726882990.57386: done generating all_blocks data 28983 1726882990.57391: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726882990.57392: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726882990.57395: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726882990.57984: in VariableManager get_vars() 28983 1726882990.58005: done with get_vars() 28983 1726882990.58060: in VariableManager get_vars() 28983 1726882990.58088: done with get_vars() 28983 1726882990.58143: in VariableManager get_vars() 28983 1726882990.58162: done with get_vars() 28983 1726882990.58221: in VariableManager get_vars() 28983 1726882990.58244: done with get_vars() 28983 1726882990.58320: in VariableManager get_vars() 28983 1726882990.58343: done with get_vars() 28983 1726882990.58896: in VariableManager get_vars() 28983 1726882990.58916: done with get_vars() 28983 1726882990.58931: done processing included file 28983 1726882990.58938: iterating over new_blocks loaded from include file 28983 1726882990.58940: in VariableManager get_vars() 28983 1726882990.58954: done with get_vars() 28983 1726882990.58956: filtering new block on tags 28983 1726882990.59123: done filtering new block on tags 28983 1726882990.59127: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 28983 1726882990.59136: extending task lists for all hosts with included 
blocks 28983 1726882990.59189: done extending task lists 28983 1726882990.59190: done processing included files 28983 1726882990.59191: results queue empty 28983 1726882990.59192: checking for any_errors_fatal 28983 1726882990.59198: done checking for any_errors_fatal 28983 1726882990.59199: checking for max_fail_percentage 28983 1726882990.59201: done checking for max_fail_percentage 28983 1726882990.59202: checking to see if all hosts have failed and the running result is not ok 28983 1726882990.59203: done checking to see if all hosts have failed 28983 1726882990.59204: getting the remaining hosts for this loop 28983 1726882990.59206: done getting the remaining hosts for this loop 28983 1726882990.59209: getting the next task for host managed_node2 28983 1726882990.59213: done getting next task for host managed_node2 28983 1726882990.59216: ^ task is: TASK: TEST: {{ lsr_description }} 28983 1726882990.59219: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882990.59221: getting variables 28983 1726882990.59222: in VariableManager get_vars() 28983 1726882990.59233: Calling all_inventory to load vars for managed_node2 28983 1726882990.59238: Calling groups_inventory to load vars for managed_node2 28983 1726882990.59241: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882990.59247: Calling all_plugins_play to load vars for managed_node2 28983 1726882990.59250: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882990.59254: Calling groups_plugins_play to load vars for managed_node2 28983 1726882990.64066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882990.70468: done with get_vars() 28983 1726882990.70514: done getting variables 28983 1726882990.70728: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882990.71032: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:43:10 -0400 (0:00:00.220) 0:00:20.708 ****** 28983 1726882990.71068: entering _queue_task() for managed_node2/debug 28983 1726882990.71932: worker is 1 (out of 1 available) 28983 1726882990.71947: exiting _queue_task() for managed_node2/debug 28983 1726882990.71959: done queuing things up, now waiting for results queue to drain 28983 1726882990.71961: waiting for pending results... 
28983 1726882990.72529: running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile without autoconnect 28983 1726882990.72762: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005b4 28983 1726882990.72778: variable 'ansible_search_path' from source: unknown 28983 1726882990.72783: variable 'ansible_search_path' from source: unknown 28983 1726882990.72817: calling self._execute() 28983 1726882990.72917: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.72924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.72937: variable 'omit' from source: magic vars 28983 1726882990.73866: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.73880: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.73890: variable 'omit' from source: magic vars 28983 1726882990.73939: variable 'omit' from source: magic vars 28983 1726882990.74382: variable 'lsr_description' from source: include params 28983 1726882990.74436: variable 'omit' from source: magic vars 28983 1726882990.74458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882990.74499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.74521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882990.74664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.74668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.74743: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.74746: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726882990.74748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.75042: Set connection var ansible_connection to ssh 28983 1726882990.75056: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.75069: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.75078: Set connection var ansible_timeout to 10 28983 1726882990.75086: Set connection var ansible_pipelining to False 28983 1726882990.75088: Set connection var ansible_shell_type to sh 28983 1726882990.75116: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.75120: variable 'ansible_connection' from source: unknown 28983 1726882990.75122: variable 'ansible_module_compression' from source: unknown 28983 1726882990.75127: variable 'ansible_shell_type' from source: unknown 28983 1726882990.75130: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.75182: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.75185: variable 'ansible_pipelining' from source: unknown 28983 1726882990.75188: variable 'ansible_timeout' from source: unknown 28983 1726882990.75190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.75515: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.75529: variable 'omit' from source: magic vars 28983 1726882990.75537: starting attempt loop 28983 1726882990.75646: running the handler 28983 1726882990.75698: handler run complete 28983 1726882990.75717: attempt loop complete, returning result 28983 1726882990.75720: _execute() done 28983 1726882990.75736: dumping result to json 28983 1726882990.75740: done dumping result, returning 
28983 1726882990.75742: done running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile without autoconnect [0affe814-3a2d-b16d-c0a7-0000000005b4] 28983 1726882990.75847: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b4 ok: [managed_node2] => {} MSG: ########## I can create a profile without autoconnect ########## 28983 1726882990.76014: no more pending results, returning what we have 28983 1726882990.76018: results queue empty 28983 1726882990.76019: checking for any_errors_fatal 28983 1726882990.76022: done checking for any_errors_fatal 28983 1726882990.76023: checking for max_fail_percentage 28983 1726882990.76025: done checking for max_fail_percentage 28983 1726882990.76026: checking to see if all hosts have failed and the running result is not ok 28983 1726882990.76027: done checking to see if all hosts have failed 28983 1726882990.76028: getting the remaining hosts for this loop 28983 1726882990.76030: done getting the remaining hosts for this loop 28983 1726882990.76042: getting the next task for host managed_node2 28983 1726882990.76055: done getting next task for host managed_node2 28983 1726882990.76063: ^ task is: TASK: Show item 28983 1726882990.76067: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882990.76070: getting variables 28983 1726882990.76072: in VariableManager get_vars() 28983 1726882990.76108: Calling all_inventory to load vars for managed_node2 28983 1726882990.76111: Calling groups_inventory to load vars for managed_node2 28983 1726882990.76115: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882990.76280: Calling all_plugins_play to load vars for managed_node2 28983 1726882990.76284: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882990.76288: Calling groups_plugins_play to load vars for managed_node2 28983 1726882990.76954: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b4 28983 1726882990.76958: WORKER PROCESS EXITING 28983 1726882990.80877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882990.85575: done with get_vars() 28983 1726882990.85613: done getting variables 28983 1726882990.85691: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:43:10 -0400 (0:00:00.146) 0:00:20.855 ****** 28983 1726882990.85726: entering _queue_task() for managed_node2/debug 28983 1726882990.86129: worker is 1 (out of 1 available) 28983 1726882990.86149: exiting _queue_task() for managed_node2/debug 28983 1726882990.86164: done queuing things up, now waiting for results queue to drain 28983 1726882990.86166: waiting for pending results... 
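The "Show item" task queued above (run_test.yml:9) loops over the lsr_* include parameters and prints each one. Judging from `ansible_loop_var: item` and the per-item output that follows, the task is likely something like this (a hypothetical sketch; the variable list and exact form are inferred, not taken from the actual file):

```yaml
# Hypothetical sketch of the "Show item" task, inferred from the
# per-item debug output (one result per lsr_* variable).
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
```

Templating the `var` argument with `{{ item }}` makes debug resolve each name indirectly, which matches the output pairs (`"item": "lsr_description"` alongside the resolved `"lsr_description": ...` value).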
28983 1726882990.86458: running TaskExecutor() for managed_node2/TASK: Show item 28983 1726882990.86597: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005b5 28983 1726882990.86624: variable 'ansible_search_path' from source: unknown 28983 1726882990.86633: variable 'ansible_search_path' from source: unknown 28983 1726882990.86701: variable 'omit' from source: magic vars 28983 1726882990.86889: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.86907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.86928: variable 'omit' from source: magic vars 28983 1726882990.87354: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.87380: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.87391: variable 'omit' from source: magic vars 28983 1726882990.87442: variable 'omit' from source: magic vars 28983 1726882990.87507: variable 'item' from source: unknown 28983 1726882990.87696: variable 'item' from source: unknown 28983 1726882990.87700: variable 'omit' from source: magic vars 28983 1726882990.87702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882990.87723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.87750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882990.87778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.87796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.87840: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.87850: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726882990.87858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.87990: Set connection var ansible_connection to ssh 28983 1726882990.88010: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.88030: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.88047: Set connection var ansible_timeout to 10 28983 1726882990.88058: Set connection var ansible_pipelining to False 28983 1726882990.88066: Set connection var ansible_shell_type to sh 28983 1726882990.88097: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.88105: variable 'ansible_connection' from source: unknown 28983 1726882990.88113: variable 'ansible_module_compression' from source: unknown 28983 1726882990.88120: variable 'ansible_shell_type' from source: unknown 28983 1726882990.88132: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.88141: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.88150: variable 'ansible_pipelining' from source: unknown 28983 1726882990.88239: variable 'ansible_timeout' from source: unknown 28983 1726882990.88245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.88339: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.88361: variable 'omit' from source: magic vars 28983 1726882990.88374: starting attempt loop 28983 1726882990.88383: running the handler 28983 1726882990.88442: variable 'lsr_description' from source: include params 28983 1726882990.88530: variable 'lsr_description' from source: include params 28983 1726882990.88550: handler run complete 28983 1726882990.88584: attempt loop 
complete, returning result 28983 1726882990.88607: variable 'item' from source: unknown 28983 1726882990.88694: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 28983 1726882990.89126: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.89130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.89133: variable 'omit' from source: magic vars 28983 1726882990.89195: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.89218: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.89227: variable 'omit' from source: magic vars 28983 1726882990.89254: variable 'omit' from source: magic vars 28983 1726882990.89311: variable 'item' from source: unknown 28983 1726882990.89399: variable 'item' from source: unknown 28983 1726882990.89421: variable 'omit' from source: magic vars 28983 1726882990.89452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.89466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.89486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.89506: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.89515: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.89523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.89654: Set connection var ansible_connection to ssh 28983 1726882990.89671: Set connection var ansible_shell_executable to 
/bin/sh 28983 1726882990.89694: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.89708: Set connection var ansible_timeout to 10 28983 1726882990.89718: Set connection var ansible_pipelining to False 28983 1726882990.89726: Set connection var ansible_shell_type to sh 28983 1726882990.89754: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.89762: variable 'ansible_connection' from source: unknown 28983 1726882990.89805: variable 'ansible_module_compression' from source: unknown 28983 1726882990.89808: variable 'ansible_shell_type' from source: unknown 28983 1726882990.89810: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.89812: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.89814: variable 'ansible_pipelining' from source: unknown 28983 1726882990.89816: variable 'ansible_timeout' from source: unknown 28983 1726882990.89821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.89936: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.89980: variable 'omit' from source: magic vars 28983 1726882990.89983: starting attempt loop 28983 1726882990.89986: running the handler 28983 1726882990.90042: variable 'lsr_setup' from source: include params 28983 1726882990.90118: variable 'lsr_setup' from source: include params 28983 1726882990.90205: handler run complete 28983 1726882990.90220: attempt loop complete, returning result 28983 1726882990.90341: variable 'item' from source: unknown 28983 1726882990.90345: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 28983 1726882990.90548: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.90552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.90577: variable 'omit' from source: magic vars 28983 1726882990.90942: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.90956: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.90966: variable 'omit' from source: magic vars 28983 1726882990.90997: variable 'omit' from source: magic vars 28983 1726882990.91060: variable 'item' from source: unknown 28983 1726882990.91146: variable 'item' from source: unknown 28983 1726882990.91167: variable 'omit' from source: magic vars 28983 1726882990.91194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.91208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.91224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.91245: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.91253: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.91324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.91363: Set connection var ansible_connection to ssh 28983 1726882990.91382: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.91398: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.91414: Set connection var ansible_timeout to 10 28983 1726882990.91427: Set connection var ansible_pipelining to False 28983 1726882990.91442: 
Set connection var ansible_shell_type to sh 28983 1726882990.91467: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.91478: variable 'ansible_connection' from source: unknown 28983 1726882990.91486: variable 'ansible_module_compression' from source: unknown 28983 1726882990.91493: variable 'ansible_shell_type' from source: unknown 28983 1726882990.91500: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.91507: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.91515: variable 'ansible_pipelining' from source: unknown 28983 1726882990.91542: variable 'ansible_timeout' from source: unknown 28983 1726882990.91545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.91651: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.91739: variable 'omit' from source: magic vars 28983 1726882990.91743: starting attempt loop 28983 1726882990.91745: running the handler 28983 1726882990.91747: variable 'lsr_test' from source: include params 28983 1726882990.91793: variable 'lsr_test' from source: include params 28983 1726882990.91818: handler run complete 28983 1726882990.91844: attempt loop complete, returning result 28983 1726882990.91877: variable 'item' from source: unknown 28983 1726882990.91950: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 28983 1726882990.92133: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.92197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726882990.92200: variable 'omit' from source: magic vars 28983 1726882990.92485: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.92498: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.92508: variable 'omit' from source: magic vars 28983 1726882990.92537: variable 'omit' from source: magic vars 28983 1726882990.92596: variable 'item' from source: unknown 28983 1726882990.92702: variable 'item' from source: unknown 28983 1726882990.92742: variable 'omit' from source: magic vars 28983 1726882990.92757: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.92775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.92852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.92856: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.92858: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.92861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.92919: Set connection var ansible_connection to ssh 28983 1726882990.92938: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.92959: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.92983: Set connection var ansible_timeout to 10 28983 1726882990.92996: Set connection var ansible_pipelining to False 28983 1726882990.93003: Set connection var ansible_shell_type to sh 28983 1726882990.93029: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.93040: variable 'ansible_connection' from source: unknown 28983 1726882990.93069: variable 'ansible_module_compression' from 
source: unknown 28983 1726882990.93074: variable 'ansible_shell_type' from source: unknown 28983 1726882990.93077: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.93080: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.93082: variable 'ansible_pipelining' from source: unknown 28983 1726882990.93140: variable 'ansible_timeout' from source: unknown 28983 1726882990.93143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.93220: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.93237: variable 'omit' from source: magic vars 28983 1726882990.93248: starting attempt loop 28983 1726882990.93256: running the handler 28983 1726882990.93291: variable 'lsr_assert' from source: include params 28983 1726882990.93372: variable 'lsr_assert' from source: include params 28983 1726882990.93408: handler run complete 28983 1726882990.93431: attempt loop complete, returning result 28983 1726882990.93509: variable 'item' from source: unknown 28983 1726882990.93546: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 28983 1726882990.93875: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.93880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.93883: variable 'omit' from source: magic vars 28983 1726882990.94023: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.94037: Evaluated conditional (ansible_distribution_major_version != '6'): True 
28983 1726882990.94052: variable 'omit' from source: magic vars 28983 1726882990.94077: variable 'omit' from source: magic vars 28983 1726882990.94189: variable 'item' from source: unknown 28983 1726882990.94311: variable 'item' from source: unknown 28983 1726882990.94394: variable 'omit' from source: magic vars 28983 1726882990.94420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.94439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.94496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.94541: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.94545: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.94547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.94885: Set connection var ansible_connection to ssh 28983 1726882990.94994: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.94998: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.95001: Set connection var ansible_timeout to 10 28983 1726882990.95003: Set connection var ansible_pipelining to False 28983 1726882990.95005: Set connection var ansible_shell_type to sh 28983 1726882990.95008: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.95010: variable 'ansible_connection' from source: unknown 28983 1726882990.95012: variable 'ansible_module_compression' from source: unknown 28983 1726882990.95014: variable 'ansible_shell_type' from source: unknown 28983 1726882990.95016: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.95018: variable 'ansible_host' from source: host vars 
for 'managed_node2' 28983 1726882990.95020: variable 'ansible_pipelining' from source: unknown 28983 1726882990.95022: variable 'ansible_timeout' from source: unknown 28983 1726882990.95032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.95270: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.95290: variable 'omit' from source: magic vars 28983 1726882990.95327: starting attempt loop 28983 1726882990.95335: running the handler 28983 1726882990.95571: handler run complete 28983 1726882990.95749: attempt loop complete, returning result 28983 1726882990.95753: variable 'item' from source: unknown 28983 1726882990.95868: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 28983 1726882990.96305: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.96309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.96311: variable 'omit' from source: magic vars 28983 1726882990.96740: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.96745: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.96748: variable 'omit' from source: magic vars 28983 1726882990.96750: variable 'omit' from source: magic vars 28983 1726882990.96752: variable 'item' from source: unknown 28983 1726882990.96919: variable 'item' from source: unknown 28983 1726882990.96945: variable 'omit' from source: magic vars 28983 1726882990.97069: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.97082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.97095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.97112: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.97121: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.97130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.97404: Set connection var ansible_connection to ssh 28983 1726882990.97407: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.97410: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.97412: Set connection var ansible_timeout to 10 28983 1726882990.97414: Set connection var ansible_pipelining to False 28983 1726882990.97416: Set connection var ansible_shell_type to sh 28983 1726882990.97522: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.97529: variable 'ansible_connection' from source: unknown 28983 1726882990.97538: variable 'ansible_module_compression' from source: unknown 28983 1726882990.97544: variable 'ansible_shell_type' from source: unknown 28983 1726882990.97550: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.97555: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.97562: variable 'ansible_pipelining' from source: unknown 28983 1726882990.97567: variable 'ansible_timeout' from source: unknown 28983 1726882990.97577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.97839: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.97853: variable 'omit' from source: magic vars 28983 1726882990.97861: starting attempt loop 28983 1726882990.97870: running the handler 28983 1726882990.97895: variable 'lsr_fail_debug' from source: play vars 28983 1726882990.98140: variable 'lsr_fail_debug' from source: play vars 28983 1726882990.98143: handler run complete 28983 1726882990.98196: attempt loop complete, returning result 28983 1726882990.98219: variable 'item' from source: unknown 28983 1726882990.98348: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 28983 1726882990.98716: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.98939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.98942: variable 'omit' from source: magic vars 28983 1726882990.98945: variable 'ansible_distribution_major_version' from source: facts 28983 1726882990.99070: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882990.99083: variable 'omit' from source: magic vars 28983 1726882990.99103: variable 'omit' from source: magic vars 28983 1726882990.99161: variable 'item' from source: unknown 28983 1726882990.99254: variable 'item' from source: unknown 28983 1726882990.99285: variable 'omit' from source: magic vars 28983 1726882990.99333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882990.99350: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.99362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882990.99391: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882990.99399: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.99431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.99528: Set connection var ansible_connection to ssh 28983 1726882990.99561: Set connection var ansible_shell_executable to /bin/sh 28983 1726882990.99594: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882990.99613: Set connection var ansible_timeout to 10 28983 1726882990.99624: Set connection var ansible_pipelining to False 28983 1726882990.99630: Set connection var ansible_shell_type to sh 28983 1726882990.99665: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.99676: variable 'ansible_connection' from source: unknown 28983 1726882990.99685: variable 'ansible_module_compression' from source: unknown 28983 1726882990.99692: variable 'ansible_shell_type' from source: unknown 28983 1726882990.99715: variable 'ansible_shell_executable' from source: unknown 28983 1726882990.99718: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882990.99720: variable 'ansible_pipelining' from source: unknown 28983 1726882990.99723: variable 'ansible_timeout' from source: unknown 28983 1726882990.99732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882990.99934: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882990.99939: variable 'omit' from source: magic vars 28983 1726882990.99941: starting attempt loop 28983 1726882990.99944: running the handler 28983 1726882990.99946: variable 'lsr_cleanup' from source: include params 28983 1726882991.00015: variable 'lsr_cleanup' from source: include params 28983 1726882991.00046: handler run complete 28983 1726882991.00069: attempt loop complete, returning result 28983 1726882991.00095: variable 'item' from source: unknown 28983 1726882991.00180: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 28983 1726882991.00413: dumping result to json 28983 1726882991.00417: done dumping result, returning 28983 1726882991.00420: done running TaskExecutor() for managed_node2/TASK: Show item [0affe814-3a2d-b16d-c0a7-0000000005b5] 28983 1726882991.00423: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b5 28983 1726882991.00544: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b5 28983 1726882991.00548: WORKER PROCESS EXITING 28983 1726882991.00805: no more pending results, returning what we have 28983 1726882991.00809: results queue empty 28983 1726882991.00810: checking for any_errors_fatal 28983 1726882991.00817: done checking for any_errors_fatal 28983 1726882991.00818: checking for max_fail_percentage 28983 1726882991.00820: done checking for max_fail_percentage 28983 1726882991.00821: checking to see if all hosts have failed and the running result is not ok 28983 1726882991.00822: done checking to see if all hosts have failed 28983 1726882991.00823: getting the remaining hosts for this loop 28983 1726882991.00825: done getting the remaining hosts for this loop 28983 
1726882991.00829: getting the next task for host managed_node2 28983 1726882991.00838: done getting next task for host managed_node2 28983 1726882991.00841: ^ task is: TASK: Include the task 'show_interfaces.yml' 28983 1726882991.00845: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882991.00848: getting variables 28983 1726882991.00850: in VariableManager get_vars() 28983 1726882991.00882: Calling all_inventory to load vars for managed_node2 28983 1726882991.00886: Calling groups_inventory to load vars for managed_node2 28983 1726882991.00889: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882991.00900: Calling all_plugins_play to load vars for managed_node2 28983 1726882991.00904: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882991.00908: Calling groups_plugins_play to load vars for managed_node2 28983 1726882991.03384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882991.07244: done with get_vars() 28983 1726882991.07287: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:43:11 -0400 (0:00:00.216) 0:00:21.071 ****** 28983 1726882991.07403: entering _queue_task() for managed_node2/include_tasks 28983 
1726882991.07770: worker is 1 (out of 1 available) 28983 1726882991.07784: exiting _queue_task() for managed_node2/include_tasks 28983 1726882991.07798: done queuing things up, now waiting for results queue to drain 28983 1726882991.07800: waiting for pending results... 28983 1726882991.08114: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28983 1726882991.08339: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005b6 28983 1726882991.08394: variable 'ansible_search_path' from source: unknown 28983 1726882991.08423: variable 'ansible_search_path' from source: unknown 28983 1726882991.08491: calling self._execute() 28983 1726882991.08707: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882991.08711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882991.08744: variable 'omit' from source: magic vars 28983 1726882991.09366: variable 'ansible_distribution_major_version' from source: facts 28983 1726882991.09386: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882991.09389: _execute() done 28983 1726882991.09395: dumping result to json 28983 1726882991.09398: done dumping result, returning 28983 1726882991.09406: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-b16d-c0a7-0000000005b6] 28983 1726882991.09414: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b6 28983 1726882991.09607: no more pending results, returning what we have 28983 1726882991.09614: in VariableManager get_vars() 28983 1726882991.09661: Calling all_inventory to load vars for managed_node2 28983 1726882991.09669: Calling groups_inventory to load vars for managed_node2 28983 1726882991.09674: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882991.09692: Calling all_plugins_play to load vars for managed_node2 28983 1726882991.09696: Calling groups_plugins_inventory to load 
vars for managed_node2 28983 1726882991.09700: Calling groups_plugins_play to load vars for managed_node2 28983 1726882991.10279: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b6 28983 1726882991.10283: WORKER PROCESS EXITING 28983 1726882991.14416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882991.19011: done with get_vars() 28983 1726882991.19045: variable 'ansible_search_path' from source: unknown 28983 1726882991.19047: variable 'ansible_search_path' from source: unknown 28983 1726882991.19110: we have included files to process 28983 1726882991.19112: generating all_blocks data 28983 1726882991.19114: done generating all_blocks data 28983 1726882991.19120: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726882991.19121: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726882991.19124: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726882991.19377: in VariableManager get_vars() 28983 1726882991.19431: done with get_vars() 28983 1726882991.19687: done processing included file 28983 1726882991.19690: iterating over new_blocks loaded from include file 28983 1726882991.19691: in VariableManager get_vars() 28983 1726882991.19707: done with get_vars() 28983 1726882991.19709: filtering new block on tags 28983 1726882991.19783: done filtering new block on tags 28983 1726882991.19786: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28983 1726882991.19791: extending task lists for all hosts with included blocks 28983 1726882991.20453: 
done extending task lists 28983 1726882991.20455: done processing included files 28983 1726882991.20456: results queue empty 28983 1726882991.20457: checking for any_errors_fatal 28983 1726882991.20463: done checking for any_errors_fatal 28983 1726882991.20464: checking for max_fail_percentage 28983 1726882991.20466: done checking for max_fail_percentage 28983 1726882991.20467: checking to see if all hosts have failed and the running result is not ok 28983 1726882991.20468: done checking to see if all hosts have failed 28983 1726882991.20469: getting the remaining hosts for this loop 28983 1726882991.20471: done getting the remaining hosts for this loop 28983 1726882991.20474: getting the next task for host managed_node2 28983 1726882991.20479: done getting next task for host managed_node2 28983 1726882991.20488: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28983 1726882991.20492: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882991.20496: getting variables 28983 1726882991.20497: in VariableManager get_vars() 28983 1726882991.20508: Calling all_inventory to load vars for managed_node2 28983 1726882991.20511: Calling groups_inventory to load vars for managed_node2 28983 1726882991.20514: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882991.20521: Calling all_plugins_play to load vars for managed_node2 28983 1726882991.20524: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882991.20528: Calling groups_plugins_play to load vars for managed_node2 28983 1726882991.23039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882991.34850: done with get_vars() 28983 1726882991.34887: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:43:11 -0400 (0:00:00.276) 0:00:21.347 ****** 28983 1726882991.35043: entering _queue_task() for managed_node2/include_tasks 28983 1726882991.35611: worker is 1 (out of 1 available) 28983 1726882991.35624: exiting _queue_task() for managed_node2/include_tasks 28983 1726882991.35639: done queuing things up, now waiting for results queue to drain 28983 1726882991.35642: waiting for pending results... 
28983 1726882991.36536: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28983 1726882991.37093: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005dd 28983 1726882991.37105: variable 'ansible_search_path' from source: unknown 28983 1726882991.37112: variable 'ansible_search_path' from source: unknown 28983 1726882991.37186: calling self._execute() 28983 1726882991.37758: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882991.37767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882991.37783: variable 'omit' from source: magic vars 28983 1726882991.38947: variable 'ansible_distribution_major_version' from source: facts 28983 1726882991.38951: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882991.38955: _execute() done 28983 1726882991.38957: dumping result to json 28983 1726882991.38961: done dumping result, returning 28983 1726882991.38963: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-b16d-c0a7-0000000005dd] 28983 1726882991.38966: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005dd 28983 1726882991.39042: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005dd 28983 1726882991.39048: WORKER PROCESS EXITING 28983 1726882991.39087: no more pending results, returning what we have 28983 1726882991.39093: in VariableManager get_vars() 28983 1726882991.39131: Calling all_inventory to load vars for managed_node2 28983 1726882991.39136: Calling groups_inventory to load vars for managed_node2 28983 1726882991.39140: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882991.39156: Calling all_plugins_play to load vars for managed_node2 28983 1726882991.39160: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882991.39164: Calling groups_plugins_play to load vars for managed_node2 28983 
1726882991.42799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882991.46098: done with get_vars() 28983 1726882991.46133: variable 'ansible_search_path' from source: unknown 28983 1726882991.46137: variable 'ansible_search_path' from source: unknown 28983 1726882991.46179: we have included files to process 28983 1726882991.46180: generating all_blocks data 28983 1726882991.46182: done generating all_blocks data 28983 1726882991.46184: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726882991.46186: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726882991.46188: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726882991.46617: done processing included file 28983 1726882991.46619: iterating over new_blocks loaded from include file 28983 1726882991.46623: in VariableManager get_vars() 28983 1726882991.46645: done with get_vars() 28983 1726882991.46647: filtering new block on tags 28983 1726882991.46701: done filtering new block on tags 28983 1726882991.46705: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28983 1726882991.46712: extending task lists for all hosts with included blocks 28983 1726882991.47021: done extending task lists 28983 1726882991.47022: done processing included files 28983 1726882991.47023: results queue empty 28983 1726882991.47024: checking for any_errors_fatal 28983 1726882991.47029: done checking for any_errors_fatal 28983 1726882991.47030: checking for max_fail_percentage 28983 1726882991.47032: done 
checking for max_fail_percentage 28983 1726882991.47033: checking to see if all hosts have failed and the running result is not ok 28983 1726882991.47036: done checking to see if all hosts have failed 28983 1726882991.47037: getting the remaining hosts for this loop 28983 1726882991.47039: done getting the remaining hosts for this loop 28983 1726882991.47042: getting the next task for host managed_node2 28983 1726882991.47048: done getting next task for host managed_node2 28983 1726882991.47050: ^ task is: TASK: Gather current interface info 28983 1726882991.47070: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882991.47082: getting variables 28983 1726882991.47084: in VariableManager get_vars() 28983 1726882991.47096: Calling all_inventory to load vars for managed_node2 28983 1726882991.47099: Calling groups_inventory to load vars for managed_node2 28983 1726882991.47102: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882991.47109: Calling all_plugins_play to load vars for managed_node2 28983 1726882991.47112: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882991.47116: Calling groups_plugins_play to load vars for managed_node2 28983 1726882991.49455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882991.52526: done with get_vars() 28983 1726882991.52560: done getting variables 28983 1726882991.52613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:43:11 -0400 (0:00:00.176) 0:00:21.524 ****** 28983 1726882991.52655: entering _queue_task() for managed_node2/command 28983 1726882991.53025: worker is 1 (out of 1 available) 28983 1726882991.53040: exiting _queue_task() for managed_node2/command 28983 1726882991.53054: done queuing things up, now waiting for results queue to drain 28983 1726882991.53055: waiting for pending results... 
28983 1726882991.53457: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28983 1726882991.53491: in run() - task 0affe814-3a2d-b16d-c0a7-000000000618 28983 1726882991.53507: variable 'ansible_search_path' from source: unknown 28983 1726882991.53510: variable 'ansible_search_path' from source: unknown 28983 1726882991.53554: calling self._execute() 28983 1726882991.53661: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882991.53667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882991.53684: variable 'omit' from source: magic vars 28983 1726882991.54108: variable 'ansible_distribution_major_version' from source: facts 28983 1726882991.54118: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882991.54125: variable 'omit' from source: magic vars 28983 1726882991.54198: variable 'omit' from source: magic vars 28983 1726882991.54261: variable 'omit' from source: magic vars 28983 1726882991.54323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882991.54430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882991.54441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882991.54450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882991.54468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882991.54511: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882991.54521: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882991.54538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726882991.54690: Set connection var ansible_connection to ssh 28983 1726882991.54710: Set connection var ansible_shell_executable to /bin/sh 28983 1726882991.54727: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882991.54763: Set connection var ansible_timeout to 10 28983 1726882991.54772: Set connection var ansible_pipelining to False 28983 1726882991.54880: Set connection var ansible_shell_type to sh 28983 1726882991.54885: variable 'ansible_shell_executable' from source: unknown 28983 1726882991.54889: variable 'ansible_connection' from source: unknown 28983 1726882991.54892: variable 'ansible_module_compression' from source: unknown 28983 1726882991.54894: variable 'ansible_shell_type' from source: unknown 28983 1726882991.54897: variable 'ansible_shell_executable' from source: unknown 28983 1726882991.54899: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882991.54901: variable 'ansible_pipelining' from source: unknown 28983 1726882991.54904: variable 'ansible_timeout' from source: unknown 28983 1726882991.54906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882991.55063: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882991.55098: variable 'omit' from source: magic vars 28983 1726882991.55102: starting attempt loop 28983 1726882991.55134: running the handler 28983 1726882991.55140: _low_level_execute_command(): starting 28983 1726882991.55152: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882991.56248: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882991.56263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882991.56377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882991.58188: stdout chunk (state=3): >>>/root <<< 28983 1726882991.58362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882991.58397: stderr chunk (state=3): >>><<< 28983 1726882991.58415: stdout chunk (state=3): >>><<< 28983 1726882991.58449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882991.58576: _low_level_execute_command(): starting 28983 1726882991.58581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340 `" && echo ansible-tmp-1726882991.5845585-29821-167863907307340="` echo /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340 `" ) && sleep 0' 28983 1726882991.59144: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882991.59158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882991.59177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882991.59198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882991.59215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882991.59267: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882991.59358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882991.59402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882991.59486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882991.61557: stdout chunk (state=3): >>>ansible-tmp-1726882991.5845585-29821-167863907307340=/root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340 <<< 28983 1726882991.61746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882991.61750: stdout chunk (state=3): >>><<< 28983 1726882991.61752: stderr chunk (state=3): >>><<< 28983 1726882991.61762: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882991.5845585-29821-167863907307340=/root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882991.61841: variable 'ansible_module_compression' from source: unknown 28983 1726882991.61851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726882991.61897: variable 'ansible_facts' from source: unknown 28983 1726882991.61973: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py 28983 1726882991.62207: Sending initial data 28983 1726882991.62211: Sent initial data (156 bytes) 28983 1726882991.62731: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882991.62738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882991.62741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882991.62860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882991.62864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882991.62965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882991.64643: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726882991.64664: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28983 1726882991.64681: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 28983 1726882991.64709: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882991.64804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882991.64876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplwfui5ht /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py <<< 28983 1726882991.64895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py" <<< 28983 1726882991.64951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplwfui5ht" to remote "/root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py" <<< 28983 1726882991.66558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882991.66598: stderr chunk (state=3): >>><<< 28983 1726882991.66609: stdout chunk (state=3): >>><<< 28983 1726882991.66639: done transferring module to remote 28983 1726882991.66662: _low_level_execute_command(): starting 28983 1726882991.66670: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/ /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py && sleep 0' 28983 1726882991.67268: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882991.67278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882991.67290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882991.67306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882991.67319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 
28983 1726882991.67327: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882991.67339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882991.67354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726882991.67387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726882991.67391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726882991.67393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882991.67396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882991.67400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882991.67425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882991.67432: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726882991.67436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882991.67525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882991.67529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882991.67556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882991.67659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882991.69704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882991.69707: stdout chunk (state=3): >>><<< 28983 1726882991.69710: stderr chunk (state=3): >>><<< 28983 1726882991.69814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882991.69819: _low_level_execute_command(): starting 28983 1726882991.69823: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/AnsiballZ_command.py && sleep 0' 28983 1726882991.70457: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726882991.70473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882991.70498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882991.70650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882991.88257: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:43:11.877831", "end": "2024-09-20 21:43:11.881406", "delta": "0:00:00.003575", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726882991.90198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882991.90202: stdout chunk (state=3): >>><<< 28983 1726882991.90205: stderr chunk (state=3): >>><<< 28983 1726882991.90208: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:43:11.877831", "end": "2024-09-20 21:43:11.881406", "delta": "0:00:00.003575", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726882991.90265: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882991.90392: _low_level_execute_command(): starting 28983 1726882991.90395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882991.5845585-29821-167863907307340/ > /dev/null 2>&1 && sleep 0' 28983 1726882991.91206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882991.91224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882991.91244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882991.91277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882991.91392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726882991.91503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882991.91651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882991.93556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882991.93592: stderr chunk (state=3): >>><<< 28983 1726882991.93600: stdout chunk (state=3): >>><<< 28983 1726882991.93615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 
1726882991.93622: handler run complete 28983 1726882991.93647: Evaluated conditional (False): False 28983 1726882991.93657: attempt loop complete, returning result 28983 1726882991.93660: _execute() done 28983 1726882991.93664: dumping result to json 28983 1726882991.93673: done dumping result, returning 28983 1726882991.93683: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affe814-3a2d-b16d-c0a7-000000000618] 28983 1726882991.93689: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000618 28983 1726882991.93797: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000618 28983 1726882991.93799: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003575", "end": "2024-09-20 21:43:11.881406", "rc": 0, "start": "2024-09-20 21:43:11.877831" } STDOUT: bonding_masters eth0 lo 28983 1726882991.93890: no more pending results, returning what we have 28983 1726882991.93894: results queue empty 28983 1726882991.93895: checking for any_errors_fatal 28983 1726882991.93897: done checking for any_errors_fatal 28983 1726882991.93898: checking for max_fail_percentage 28983 1726882991.93900: done checking for max_fail_percentage 28983 1726882991.93901: checking to see if all hosts have failed and the running result is not ok 28983 1726882991.93902: done checking to see if all hosts have failed 28983 1726882991.93903: getting the remaining hosts for this loop 28983 1726882991.93906: done getting the remaining hosts for this loop 28983 1726882991.93910: getting the next task for host managed_node2 28983 1726882991.93919: done getting next task for host managed_node2 28983 1726882991.93921: ^ task is: TASK: Set current_interfaces 28983 1726882991.93927: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
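The trace above shows the `command` module running `ls -1` with `chdir: /sys/class/net` and registering its output, followed by a `Set current_interfaces` task that reads a `_current_interfaces` variable. A plausible reconstruction of the corresponding tasks in `get_current_interfaces.yml` is sketched below; the task names, module arguments, and register variable are taken from the log, but the exact filter in the `set_fact` expression is an assumption, not the actual file contents:

```yaml
# Sketch reconstructed from the -vvv trace; not the verbatim tasks file.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net          # log shows module_args {"chdir": "/sys/class/net", "_raw_params": "ls -1"}
  register: _current_interfaces    # variable name taken from the "Set current_interfaces" trace

- name: Set current_interfaces
  set_fact:
    # stdout was "bonding_masters\neth0\nlo"; splitting into a list is assumed
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

This matches the result seen in the log (`STDOUT: bonding_masters eth0 lo`, `rc: 0`), where listing `/sys/class/net` is used as a cheap way to enumerate current network interfaces on the managed host.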
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882991.93931: getting variables 28983 1726882991.93933: in VariableManager get_vars() 28983 1726882991.93968: Calling all_inventory to load vars for managed_node2 28983 1726882991.93972: Calling groups_inventory to load vars for managed_node2 28983 1726882991.93975: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882991.93986: Calling all_plugins_play to load vars for managed_node2 28983 1726882991.93989: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882991.93992: Calling groups_plugins_play to load vars for managed_node2 28983 1726882991.96558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882991.99854: done with get_vars() 28983 1726882991.99897: done getting variables 28983 1726882991.99978: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:43:11 -0400 (0:00:00.473) 0:00:21.997 ****** 28983 1726882992.00021: entering _queue_task() for managed_node2/set_fact 28983 1726882992.00405: worker is 1 (out of 1 available) 28983 1726882992.00417: exiting _queue_task() for managed_node2/set_fact 28983 1726882992.00429: done queuing things up, now waiting for results queue to drain 28983 1726882992.00431: waiting for pending results... 28983 1726882992.00853: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28983 1726882992.00924: in run() - task 0affe814-3a2d-b16d-c0a7-000000000619 28983 1726882992.00951: variable 'ansible_search_path' from source: unknown 28983 1726882992.00962: variable 'ansible_search_path' from source: unknown 28983 1726882992.01006: calling self._execute() 28983 1726882992.01116: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.01130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.01151: variable 'omit' from source: magic vars 28983 1726882992.01589: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.01610: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.01839: variable 'omit' from source: magic vars 28983 1726882992.01843: variable 'omit' from source: magic vars 28983 1726882992.01847: variable '_current_interfaces' from source: set_fact 28983 1726882992.01902: variable 'omit' from source: magic vars 28983 1726882992.01956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 
1726882992.02002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882992.02031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882992.02108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.02125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.02199: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882992.02210: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.02219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.02344: Set connection var ansible_connection to ssh 28983 1726882992.02363: Set connection var ansible_shell_executable to /bin/sh 28983 1726882992.02379: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882992.02396: Set connection var ansible_timeout to 10 28983 1726882992.02412: Set connection var ansible_pipelining to False 28983 1726882992.02420: Set connection var ansible_shell_type to sh 28983 1726882992.02454: variable 'ansible_shell_executable' from source: unknown 28983 1726882992.02464: variable 'ansible_connection' from source: unknown 28983 1726882992.02474: variable 'ansible_module_compression' from source: unknown 28983 1726882992.02483: variable 'ansible_shell_type' from source: unknown 28983 1726882992.02492: variable 'ansible_shell_executable' from source: unknown 28983 1726882992.02500: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.02507: variable 'ansible_pipelining' from source: unknown 28983 1726882992.02513: variable 'ansible_timeout' from source: unknown 28983 1726882992.02521: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 28983 1726882992.02672: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882992.02690: variable 'omit' from source: magic vars 28983 1726882992.02701: starting attempt loop 28983 1726882992.02710: running the handler 28983 1726882992.02843: handler run complete 28983 1726882992.02859: attempt loop complete, returning result 28983 1726882992.02862: _execute() done 28983 1726882992.02865: dumping result to json 28983 1726882992.02872: done dumping result, returning 28983 1726882992.02884: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affe814-3a2d-b16d-c0a7-000000000619] 28983 1726882992.02890: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000619 28983 1726882992.02995: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000619 28983 1726882992.02999: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28983 1726882992.03082: no more pending results, returning what we have 28983 1726882992.03087: results queue empty 28983 1726882992.03088: checking for any_errors_fatal 28983 1726882992.03098: done checking for any_errors_fatal 28983 1726882992.03099: checking for max_fail_percentage 28983 1726882992.03102: done checking for max_fail_percentage 28983 1726882992.03103: checking to see if all hosts have failed and the running result is not ok 28983 1726882992.03104: done checking to see if all hosts have failed 28983 1726882992.03105: getting the remaining hosts for this loop 28983 1726882992.03107: done getting the remaining hosts for this loop 28983 1726882992.03113: getting the next task for host managed_node2 
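
The two results above ("Gather current interface info" returning `bonding_masters eth0 lo`, then "Set current_interfaces" turning that into a list fact) come from the task file `get_current_interfaces.yml` named in the task banner. The log shows only the results, not the file itself, so the following is a hypothetical reconstruction: the `chdir` path and the `_current_interfaces` wiring are assumptions, not taken from the log.

```yaml
# Hypothetical sketch of tasks/get_current_interfaces.yml.
# Only the task names, the "ls -1" command, and the resulting
# current_interfaces fact appear in the log; the directory and
# the intermediate variable handling are assumed.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net        # assumed source of the interface names
  register: _current_interfaces_result

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces_result.stdout_lines }}"
```

Under these assumptions the second task would produce exactly the fact the log records: `current_interfaces: ["bonding_masters", "eth0", "lo"]`.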
28983 1726882992.03124: done getting next task for host managed_node2 28983 1726882992.03127: ^ task is: TASK: Show current_interfaces 28983 1726882992.03133: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882992.03140: getting variables 28983 1726882992.03142: in VariableManager get_vars() 28983 1726882992.03183: Calling all_inventory to load vars for managed_node2 28983 1726882992.03187: Calling groups_inventory to load vars for managed_node2 28983 1726882992.03191: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.03204: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.03208: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.03213: Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.05832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.08956: done with get_vars() 28983 1726882992.09004: done getting variables 28983 1726882992.09084: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:43:12 -0400 (0:00:00.091) 0:00:22.089 ****** 28983 1726882992.09127: entering _queue_task() for managed_node2/debug 28983 1726882992.09580: worker is 1 (out of 1 available) 28983 1726882992.09593: exiting _queue_task() for managed_node2/debug 28983 1726882992.09605: done queuing things up, now waiting for results queue to drain 28983 1726882992.09607: waiting for pending results... 
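
The "Show current_interfaces" task queued above lives at `show_interfaces.yml:5` and, per the result further down, prints `current_interfaces: ['bonding_masters', 'eth0', 'lo']`. A minimal sketch of a debug task consistent with that output (the exact file contents are not in the log, so this is an assumption):

```yaml
# Hypothetical sketch of tasks/show_interfaces.yml:5.
# The msg format is inferred from the MSG line in the task result.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```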
28983 1726882992.09882: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28983 1726882992.10141: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005de 28983 1726882992.10146: variable 'ansible_search_path' from source: unknown 28983 1726882992.10149: variable 'ansible_search_path' from source: unknown 28983 1726882992.10152: calling self._execute() 28983 1726882992.10165: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.10174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.10192: variable 'omit' from source: magic vars 28983 1726882992.10659: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.10673: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.10683: variable 'omit' from source: magic vars 28983 1726882992.10750: variable 'omit' from source: magic vars 28983 1726882992.10875: variable 'current_interfaces' from source: set_fact 28983 1726882992.10909: variable 'omit' from source: magic vars 28983 1726882992.10964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882992.11010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882992.11043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882992.11066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.11083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.11147: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882992.11150: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.11153: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.11261: Set connection var ansible_connection to ssh 28983 1726882992.11279: Set connection var ansible_shell_executable to /bin/sh 28983 1726882992.11292: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882992.11321: Set connection var ansible_timeout to 10 28983 1726882992.11324: Set connection var ansible_pipelining to False 28983 1726882992.11327: Set connection var ansible_shell_type to sh 28983 1726882992.11364: variable 'ansible_shell_executable' from source: unknown 28983 1726882992.11367: variable 'ansible_connection' from source: unknown 28983 1726882992.11370: variable 'ansible_module_compression' from source: unknown 28983 1726882992.11373: variable 'ansible_shell_type' from source: unknown 28983 1726882992.11375: variable 'ansible_shell_executable' from source: unknown 28983 1726882992.11378: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.11380: variable 'ansible_pipelining' from source: unknown 28983 1726882992.11382: variable 'ansible_timeout' from source: unknown 28983 1726882992.11385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.11568: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882992.11577: variable 'omit' from source: magic vars 28983 1726882992.11586: starting attempt loop 28983 1726882992.11591: running the handler 28983 1726882992.11648: handler run complete 28983 1726882992.11665: attempt loop complete, returning result 28983 1726882992.11669: _execute() done 28983 1726882992.11672: dumping result to json 28983 1726882992.11706: done dumping result, returning 28983 1726882992.11710: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affe814-3a2d-b16d-c0a7-0000000005de] 28983 1726882992.11713: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005de 28983 1726882992.11862: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005de 28983 1726882992.11867: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28983 1726882992.11964: no more pending results, returning what we have 28983 1726882992.11968: results queue empty 28983 1726882992.11969: checking for any_errors_fatal 28983 1726882992.11981: done checking for any_errors_fatal 28983 1726882992.11982: checking for max_fail_percentage 28983 1726882992.11984: done checking for max_fail_percentage 28983 1726882992.11986: checking to see if all hosts have failed and the running result is not ok 28983 1726882992.11987: done checking to see if all hosts have failed 28983 1726882992.11988: getting the remaining hosts for this loop 28983 1726882992.11990: done getting the remaining hosts for this loop 28983 1726882992.11995: getting the next task for host managed_node2 28983 1726882992.12006: done getting next task for host managed_node2 28983 1726882992.12011: ^ task is: TASK: Setup 28983 1726882992.12015: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882992.12019: getting variables 28983 1726882992.12020: in VariableManager get_vars() 28983 1726882992.12057: Calling all_inventory to load vars for managed_node2 28983 1726882992.12060: Calling groups_inventory to load vars for managed_node2 28983 1726882992.12065: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.12079: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.12084: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.12089: Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.14041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.15953: done with get_vars() 28983 1726882992.15986: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:43:12 -0400 (0:00:00.069) 0:00:22.158 ****** 28983 1726882992.16093: entering _queue_task() for managed_node2/include_tasks 28983 1726882992.16393: worker is 1 (out of 1 available) 28983 1726882992.16412: exiting _queue_task() for managed_node2/include_tasks 28983 1726882992.16427: done queuing things up, now waiting for results queue to drain 28983 1726882992.16430: waiting for pending results... 
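
The "Setup" task at `run_test.yml:24` is an `include_tasks` step: the log shows it reading the `lsr_setup` include param and emitting one include per item (`tasks/delete_interface.yml` and `tasks/assert_device_absent.yml`), each gated on the `ansible_distribution_major_version != '6'` conditional that the log evaluates twice. A sketch consistent with that behavior (the loop variable name and exact layout are assumptions):

```yaml
# Hypothetical sketch of the Setup step at run_test.yml:24.
# The lsr_setup list and the distribution-version conditional are
# visible in the log; the loop syntax itself is assumed.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
  when: ansible_distribution_major_version != '6'
```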
28983 1726882992.16662: running TaskExecutor() for managed_node2/TASK: Setup 28983 1726882992.16739: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005b7 28983 1726882992.16752: variable 'ansible_search_path' from source: unknown 28983 1726882992.16756: variable 'ansible_search_path' from source: unknown 28983 1726882992.16797: variable 'lsr_setup' from source: include params 28983 1726882992.16972: variable 'lsr_setup' from source: include params 28983 1726882992.17030: variable 'omit' from source: magic vars 28983 1726882992.17144: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.17153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.17165: variable 'omit' from source: magic vars 28983 1726882992.17374: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.17387: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.17396: variable 'item' from source: unknown 28983 1726882992.17449: variable 'item' from source: unknown 28983 1726882992.17479: variable 'item' from source: unknown 28983 1726882992.17534: variable 'item' from source: unknown 28983 1726882992.17663: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.17667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.17669: variable 'omit' from source: magic vars 28983 1726882992.17790: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.17794: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.17801: variable 'item' from source: unknown 28983 1726882992.17853: variable 'item' from source: unknown 28983 1726882992.17879: variable 'item' from source: unknown 28983 1726882992.17932: variable 'item' from source: unknown 28983 1726882992.18010: dumping result to json 28983 1726882992.18014: done dumping result, returning 28983 
1726882992.18017: done running TaskExecutor() for managed_node2/TASK: Setup [0affe814-3a2d-b16d-c0a7-0000000005b7] 28983 1726882992.18019: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b7 28983 1726882992.18061: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b7 28983 1726882992.18064: WORKER PROCESS EXITING 28983 1726882992.18150: no more pending results, returning what we have 28983 1726882992.18155: in VariableManager get_vars() 28983 1726882992.18185: Calling all_inventory to load vars for managed_node2 28983 1726882992.18188: Calling groups_inventory to load vars for managed_node2 28983 1726882992.18191: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.18201: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.18204: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.18208: Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.20220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.21855: done with get_vars() 28983 1726882992.21880: variable 'ansible_search_path' from source: unknown 28983 1726882992.21881: variable 'ansible_search_path' from source: unknown 28983 1726882992.21913: variable 'ansible_search_path' from source: unknown 28983 1726882992.21913: variable 'ansible_search_path' from source: unknown 28983 1726882992.21937: we have included files to process 28983 1726882992.21938: generating all_blocks data 28983 1726882992.21939: done generating all_blocks data 28983 1726882992.21943: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28983 1726882992.21944: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28983 1726882992.21945: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 28983 1726882992.22112: done processing included file 28983 1726882992.22114: iterating over new_blocks loaded from include file 28983 1726882992.22115: in VariableManager get_vars() 28983 1726882992.22126: done with get_vars() 28983 1726882992.22127: filtering new block on tags 28983 1726882992.22147: done filtering new block on tags 28983 1726882992.22149: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 => (item=tasks/delete_interface.yml) 28983 1726882992.22153: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726882992.22154: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726882992.22156: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726882992.22244: in VariableManager get_vars() 28983 1726882992.22259: done with get_vars() 28983 1726882992.22358: done processing included file 28983 1726882992.22360: iterating over new_blocks loaded from include file 28983 1726882992.22361: in VariableManager get_vars() 28983 1726882992.22371: done with get_vars() 28983 1726882992.22375: filtering new block on tags 28983 1726882992.22400: done filtering new block on tags 28983 1726882992.22401: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 28983 1726882992.22404: extending task lists for all hosts with 
included blocks 28983 1726882992.23204: done extending task lists 28983 1726882992.23206: done processing included files 28983 1726882992.23207: results queue empty 28983 1726882992.23208: checking for any_errors_fatal 28983 1726882992.23211: done checking for any_errors_fatal 28983 1726882992.23212: checking for max_fail_percentage 28983 1726882992.23213: done checking for max_fail_percentage 28983 1726882992.23214: checking to see if all hosts have failed and the running result is not ok 28983 1726882992.23215: done checking to see if all hosts have failed 28983 1726882992.23216: getting the remaining hosts for this loop 28983 1726882992.23218: done getting the remaining hosts for this loop 28983 1726882992.23221: getting the next task for host managed_node2 28983 1726882992.23226: done getting next task for host managed_node2 28983 1726882992.23228: ^ task is: TASK: Remove test interface if necessary 28983 1726882992.23231: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882992.23236: getting variables 28983 1726882992.23237: in VariableManager get_vars() 28983 1726882992.23252: Calling all_inventory to load vars for managed_node2 28983 1726882992.23255: Calling groups_inventory to load vars for managed_node2 28983 1726882992.23258: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.23266: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.23269: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.23275: Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.25023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.27178: done with get_vars() 28983 1726882992.27199: done getting variables 28983 1726882992.27237: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:43:12 -0400 (0:00:00.111) 0:00:22.270 ****** 28983 1726882992.27262: entering _queue_task() for managed_node2/command 28983 1726882992.27517: worker is 1 (out of 1 available) 28983 1726882992.27531: exiting _queue_task() for managed_node2/command 28983 1726882992.27546: done queuing things up, now waiting for results queue to drain 28983 1726882992.27548: waiting for pending results... 
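
The task queued here, "Remove test interface if necessary" at `delete_interface.yml:3`, is a `command` task, but the log excerpt ends before the actual command is executed (only the connection setup, `echo ~`, and remote tmp-dir creation are shown). The sketch below is therefore purely hypothetical: the `ip link del` invocation and the error handling are assumptions; only the task name and the `interface` play variable appear in the log.

```yaml
# Hypothetical sketch of tasks/delete_interface.yml:3.
# The actual command never appears in this log excerpt; "ip link del"
# and failed_when are guesses at what "if necessary" implies.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
  register: _delete_result
  failed_when: false   # assumed: a missing interface should not fail setup
```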
28983 1726882992.27744: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 28983 1726882992.27831: in run() - task 0affe814-3a2d-b16d-c0a7-00000000063e 28983 1726882992.27852: variable 'ansible_search_path' from source: unknown 28983 1726882992.27857: variable 'ansible_search_path' from source: unknown 28983 1726882992.27891: calling self._execute() 28983 1726882992.27968: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.27975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.27985: variable 'omit' from source: magic vars 28983 1726882992.28394: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.28398: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.28401: variable 'omit' from source: magic vars 28983 1726882992.28533: variable 'omit' from source: magic vars 28983 1726882992.28739: variable 'interface' from source: play vars 28983 1726882992.28743: variable 'omit' from source: magic vars 28983 1726882992.28746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882992.28749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882992.28752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882992.28755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.28757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.28784: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882992.28787: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.28790: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.28897: Set connection var ansible_connection to ssh 28983 1726882992.28911: Set connection var ansible_shell_executable to /bin/sh 28983 1726882992.28922: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882992.28933: Set connection var ansible_timeout to 10 28983 1726882992.28943: Set connection var ansible_pipelining to False 28983 1726882992.28946: Set connection var ansible_shell_type to sh 28983 1726882992.28971: variable 'ansible_shell_executable' from source: unknown 28983 1726882992.28984: variable 'ansible_connection' from source: unknown 28983 1726882992.28993: variable 'ansible_module_compression' from source: unknown 28983 1726882992.28995: variable 'ansible_shell_type' from source: unknown 28983 1726882992.28998: variable 'ansible_shell_executable' from source: unknown 28983 1726882992.29000: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.29002: variable 'ansible_pipelining' from source: unknown 28983 1726882992.29005: variable 'ansible_timeout' from source: unknown 28983 1726882992.29007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.29206: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882992.29210: variable 'omit' from source: magic vars 28983 1726882992.29213: starting attempt loop 28983 1726882992.29215: running the handler 28983 1726882992.29218: _low_level_execute_command(): starting 28983 1726882992.29220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882992.29811: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.29816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.29820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.29878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882992.29885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882992.29997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882992.31728: stdout chunk (state=3): >>>/root <<< 28983 1726882992.31855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882992.31903: stderr chunk (state=3): >>><<< 28983 1726882992.31907: stdout chunk (state=3): >>><<< 28983 1726882992.31933: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882992.31954: _low_level_execute_command(): starting 28983 1726882992.31957: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707 `" && echo ansible-tmp-1726882992.319298-29849-154632434056707="` echo /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707 `" ) && sleep 0' 28983 1726882992.32642: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.32646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.32650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.32661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.32796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882992.32856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882992.35240: stdout chunk (state=3): >>>ansible-tmp-1726882992.319298-29849-154632434056707=/root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707 <<< 28983 1726882992.35244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882992.35250: stdout chunk (state=3): >>><<< 28983 1726882992.35254: stderr chunk (state=3): >>><<< 28983 1726882992.35257: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882992.319298-29849-154632434056707=/root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882992.35261: variable 'ansible_module_compression' from source: unknown 28983 1726882992.35264: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726882992.35270: variable 'ansible_facts' from source: unknown 28983 1726882992.35364: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py 28983 1726882992.35562: Sending initial data 28983 1726882992.35575: Sent initial data (155 bytes) 28983 1726882992.36237: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882992.36365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882992.36420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882992.36564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882992.38177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882992.38261: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882992.38333: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqt2rorcv /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py <<< 28983 1726882992.38342: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py" <<< 28983 1726882992.38395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqt2rorcv" to remote "/root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py" <<< 28983 1726882992.39843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882992.39847: stderr chunk (state=3): >>><<< 28983 1726882992.39850: stdout chunk (state=3): >>><<< 28983 1726882992.39852: done transferring module to remote 28983 1726882992.39854: _low_level_execute_command(): starting 28983 1726882992.39857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/ /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py && sleep 0' 28983 1726882992.40520: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882992.40536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882992.40550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.40620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882992.40624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 
1726882992.40626: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882992.40629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.40643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726882992.40717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.40757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882992.40769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882992.40826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882992.41042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882992.42979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882992.43085: stderr chunk (state=3): >>><<< 28983 1726882992.43125: stdout chunk (state=3): >>><<< 28983 1726882992.43152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882992.43165: _low_level_execute_command(): starting 28983 1726882992.43193: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/AnsiballZ_command.py && sleep 0' 28983 1726882992.43938: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882992.43952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882992.43966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.44002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882992.44014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.44118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.44143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882992.44165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882992.44188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882992.44360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882992.62245: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:43:12.614836", "end": "2024-09-20 21:43:12.621322", "delta": "0:00:00.006486", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726882992.63906: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726882992.63967: stdout chunk (state=3): >>><<< 28983 1726882992.63971: stderr chunk (state=3): >>><<< 28983 1726882992.64148: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:43:12.614836", "end": "2024-09-20 21:43:12.621322", "delta": "0:00:00.006486", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
28983 1726882992.64153: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882992.64156: _low_level_execute_command(): starting 28983 1726882992.64158: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882992.319298-29849-154632434056707/ > /dev/null 2>&1 && sleep 0' 28983 1726882992.65093: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882992.65112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882992.65163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882992.65255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882992.65300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882992.65354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882992.65438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882992.65573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882992.67640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882992.67676: stdout chunk (state=3): >>><<< 28983 1726882992.67692: stderr chunk (state=3): >>><<< 28983 1726882992.67711: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 28983 1726882992.67723: handler run complete 28983 1726882992.67758: Evaluated conditional (False): False 28983 1726882992.67782: attempt loop complete, returning result 28983 1726882992.67793: _execute() done 28983 1726882992.67801: dumping result to json 28983 1726882992.67811: done dumping result, returning 28983 1726882992.67822: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affe814-3a2d-b16d-c0a7-00000000063e] 28983 1726882992.67830: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000063e 28983 1726882992.68261: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000063e 28983 1726882992.68267: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006486", "end": "2024-09-20 21:43:12.621322", "rc": 1, "start": "2024-09-20 21:43:12.614836" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 28983 1726882992.68392: no more pending results, returning what we have 28983 1726882992.68397: results queue empty 28983 1726882992.68398: checking for any_errors_fatal 28983 1726882992.68401: done checking for any_errors_fatal 28983 1726882992.68402: checking for max_fail_percentage 28983 1726882992.68404: done checking for max_fail_percentage 28983 1726882992.68407: checking to see if all hosts have failed and the running result is not ok 28983 1726882992.68407: done checking to see if all hosts have failed 28983 1726882992.68408: getting the remaining hosts for this loop 28983 1726882992.68413: done getting the remaining hosts for this loop 28983 1726882992.68417: getting the next task for host managed_node2 28983 1726882992.68432: done getting next task for host managed_node2 28983 1726882992.68438: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726882992.68445: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882992.68452: getting variables 28983 1726882992.68454: in VariableManager get_vars() 28983 1726882992.68503: Calling all_inventory to load vars for managed_node2 28983 1726882992.68509: Calling groups_inventory to load vars for managed_node2 28983 1726882992.68517: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.68673: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.68680: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.68690: Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.73351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.77064: done with get_vars() 28983 1726882992.77102: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:43:12 -0400 (0:00:00.500) 0:00:22.771 ****** 28983 1726882992.77325: entering _queue_task() for managed_node2/include_tasks 28983 1726882992.78086: worker is 1 (out of 1 available) 
28983 1726882992.78101: exiting _queue_task() for managed_node2/include_tasks 28983 1726882992.78115: done queuing things up, now waiting for results queue to drain 28983 1726882992.78117: waiting for pending results... 28983 1726882992.78822: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726882992.78875: in run() - task 0affe814-3a2d-b16d-c0a7-000000000642 28983 1726882992.78895: variable 'ansible_search_path' from source: unknown 28983 1726882992.78899: variable 'ansible_search_path' from source: unknown 28983 1726882992.79144: calling self._execute() 28983 1726882992.79260: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.79267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.79289: variable 'omit' from source: magic vars 28983 1726882992.80137: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.80158: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.80161: _execute() done 28983 1726882992.80164: dumping result to json 28983 1726882992.80169: done dumping result, returning 28983 1726882992.80179: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-000000000642] 28983 1726882992.80186: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000642 28983 1726882992.80414: no more pending results, returning what we have 28983 1726882992.80419: in VariableManager get_vars() 28983 1726882992.80464: Calling all_inventory to load vars for managed_node2 28983 1726882992.80468: Calling groups_inventory to load vars for managed_node2 28983 1726882992.80472: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.80487: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.80491: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.80494: 
Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.81299: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000642 28983 1726882992.81303: WORKER PROCESS EXITING 28983 1726882992.84843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.88378: done with get_vars() 28983 1726882992.88419: variable 'ansible_search_path' from source: unknown 28983 1726882992.88421: variable 'ansible_search_path' from source: unknown 28983 1726882992.88433: variable 'item' from source: include params 28983 1726882992.88571: variable 'item' from source: include params 28983 1726882992.88623: we have included files to process 28983 1726882992.88625: generating all_blocks data 28983 1726882992.88627: done generating all_blocks data 28983 1726882992.88632: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882992.88636: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882992.88639: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726882992.88991: done processing included file 28983 1726882992.88993: iterating over new_blocks loaded from include file 28983 1726882992.88995: in VariableManager get_vars() 28983 1726882992.89025: done with get_vars() 28983 1726882992.89027: filtering new block on tags 28983 1726882992.89110: done filtering new block on tags 28983 1726882992.89113: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726882992.89120: extending task lists for all hosts with included blocks 28983 1726882992.89384: 
done extending task lists 28983 1726882992.89385: done processing included files 28983 1726882992.89386: results queue empty 28983 1726882992.89387: checking for any_errors_fatal 28983 1726882992.89392: done checking for any_errors_fatal 28983 1726882992.89393: checking for max_fail_percentage 28983 1726882992.89395: done checking for max_fail_percentage 28983 1726882992.89396: checking to see if all hosts have failed and the running result is not ok 28983 1726882992.89397: done checking to see if all hosts have failed 28983 1726882992.89398: getting the remaining hosts for this loop 28983 1726882992.89400: done getting the remaining hosts for this loop 28983 1726882992.89403: getting the next task for host managed_node2 28983 1726882992.89408: done getting next task for host managed_node2 28983 1726882992.89411: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726882992.89415: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 28983 1726882992.89418: getting variables 28983 1726882992.89419: in VariableManager get_vars() 28983 1726882992.89430: Calling all_inventory to load vars for managed_node2 28983 1726882992.89433: Calling groups_inventory to load vars for managed_node2 28983 1726882992.89439: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882992.89446: Calling all_plugins_play to load vars for managed_node2 28983 1726882992.89449: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882992.89457: Calling groups_plugins_play to load vars for managed_node2 28983 1726882992.92207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882992.96670: done with get_vars() 28983 1726882992.96821: done getting variables 28983 1726882992.97191: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:43:12 -0400 (0:00:00.199) 0:00:22.970 ****** 28983 1726882992.97267: entering _queue_task() for managed_node2/stat 28983 1726882992.97726: worker is 1 (out of 1 available) 28983 1726882992.97741: exiting _queue_task() for managed_node2/stat 28983 1726882992.97755: done queuing things up, now waiting for results queue to drain 28983 1726882992.97757: waiting for pending results... 
28983 1726882992.98108: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726882992.98286: in run() - task 0affe814-3a2d-b16d-c0a7-000000000691 28983 1726882992.98309: variable 'ansible_search_path' from source: unknown 28983 1726882992.98319: variable 'ansible_search_path' from source: unknown 28983 1726882992.98445: calling self._execute() 28983 1726882992.98494: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.98507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.98526: variable 'omit' from source: magic vars 28983 1726882992.99008: variable 'ansible_distribution_major_version' from source: facts 28983 1726882992.99026: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882992.99040: variable 'omit' from source: magic vars 28983 1726882992.99124: variable 'omit' from source: magic vars 28983 1726882992.99257: variable 'interface' from source: play vars 28983 1726882992.99292: variable 'omit' from source: magic vars 28983 1726882992.99414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882992.99643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882992.99653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882992.99656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.99659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882992.99698: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882992.99717: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882992.99728: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882992.99892: Set connection var ansible_connection to ssh 28983 1726882992.99912: Set connection var ansible_shell_executable to /bin/sh 28983 1726882992.99928: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882992.99949: Set connection var ansible_timeout to 10 28983 1726882992.99978: Set connection var ansible_pipelining to False 28983 1726882992.99985: Set connection var ansible_shell_type to sh 28983 1726882993.00013: variable 'ansible_shell_executable' from source: unknown 28983 1726882993.00078: variable 'ansible_connection' from source: unknown 28983 1726882993.00087: variable 'ansible_module_compression' from source: unknown 28983 1726882993.00090: variable 'ansible_shell_type' from source: unknown 28983 1726882993.00093: variable 'ansible_shell_executable' from source: unknown 28983 1726882993.00096: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882993.00098: variable 'ansible_pipelining' from source: unknown 28983 1726882993.00100: variable 'ansible_timeout' from source: unknown 28983 1726882993.00103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882993.00406: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882993.00413: variable 'omit' from source: magic vars 28983 1726882993.00415: starting attempt loop 28983 1726882993.00418: running the handler 28983 1726882993.00420: _low_level_execute_command(): starting 28983 1726882993.00422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882993.01667: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.01926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882993.01969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882993.02001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882993.02195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882993.03966: stdout chunk (state=3): >>>/root <<< 28983 1726882993.04177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882993.04201: stdout chunk (state=3): >>><<< 28983 1726882993.04221: stderr chunk (state=3): >>><<< 28983 1726882993.04572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882993.04577: _low_level_execute_command(): starting 28983 1726882993.04580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369 `" && echo ansible-tmp-1726882993.0446255-29877-180271973453369="` echo /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369 `" ) && sleep 0' 28983 1726882993.05775: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882993.05802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882993.05823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.05871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882993.05940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882993.06024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882993.08068: stdout chunk (state=3): >>>ansible-tmp-1726882993.0446255-29877-180271973453369=/root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369 <<< 28983 1726882993.08264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882993.08267: stdout chunk (state=3): >>><<< 28983 1726882993.08270: stderr chunk (state=3): >>><<< 28983 1726882993.08440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882993.0446255-29877-180271973453369=/root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882993.08444: variable 'ansible_module_compression' from source: unknown 28983 1726882993.08446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726882993.08476: variable 'ansible_facts' from source: unknown 28983 1726882993.08570: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py 28983 1726882993.08810: Sending initial data 28983 1726882993.08813: Sent initial data (153 bytes) 28983 1726882993.09913: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.10046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882993.10130: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882993.10166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882993.10277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882993.11942: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882993.12000: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882993.12076: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpdtq954sv /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py <<< 28983 1726882993.12079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py" <<< 28983 1726882993.12137: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpdtq954sv" to remote "/root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py" <<< 28983 1726882993.13251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882993.13255: stderr chunk (state=3): >>><<< 28983 1726882993.13257: stdout chunk (state=3): >>><<< 28983 1726882993.13261: done transferring module to remote 28983 1726882993.13266: _low_level_execute_command(): starting 28983 1726882993.13275: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/ /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py && sleep 0' 28983 1726882993.13867: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882993.13881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882993.13920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882993.13924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882993.13927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.13984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882993.14007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882993.14074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882993.15939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882993.15988: stderr chunk (state=3): >>><<< 28983 1726882993.15990: stdout chunk (state=3): >>><<< 28983 1726882993.16002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882993.16042: _low_level_execute_command(): starting 28983 1726882993.16046: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/AnsiballZ_stat.py && sleep 0' 28983 1726882993.16413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882993.16450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882993.16453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.16455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882993.16457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882993.16460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.16517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726882993.16520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882993.16592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882993.33841: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726882993.35285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882993.35329: stderr chunk (state=3): >>><<< 28983 1726882993.35332: stdout chunk (state=3): >>><<< 28983 1726882993.35370: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882993.35415: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882993.35419: _low_level_execute_command(): starting 28983 1726882993.35427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882993.0446255-29877-180271973453369/ > /dev/null 2>&1 && sleep 0' 28983 1726882993.36026: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882993.36030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882993.36032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726882993.36037: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726882993.36049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882993.36106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882993.36112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882993.36180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882993.38178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882993.38210: stderr chunk (state=3): >>><<< 28983 1726882993.38217: stdout chunk (state=3): >>><<< 28983 1726882993.38235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882993.38250: handler run complete 28983 1726882993.38281: attempt loop complete, returning result 28983 1726882993.38284: _execute() done 28983 1726882993.38287: dumping result to json 28983 1726882993.38292: done dumping result, returning 28983 1726882993.38300: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-000000000691] 28983 1726882993.38305: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000691 28983 1726882993.38449: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000691 28983 1726882993.38451: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726882993.38585: no more pending results, returning what we have 28983 1726882993.38589: results queue empty 28983 1726882993.38590: checking for any_errors_fatal 28983 1726882993.38592: done checking for any_errors_fatal 28983 1726882993.38592: checking for max_fail_percentage 28983 1726882993.38595: done checking for max_fail_percentage 28983 1726882993.38596: checking to see if all hosts have failed and the running result is not ok 28983 1726882993.38597: done checking to see if all hosts have failed 28983 1726882993.38598: getting the remaining hosts for this loop 28983 1726882993.38599: done getting the remaining hosts for this loop 28983 1726882993.38604: getting the next task for host managed_node2 28983 1726882993.38613: done getting next task for host managed_node2 28983 1726882993.38616: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28983 1726882993.38621: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882993.38625: getting variables 28983 1726882993.38626: in VariableManager get_vars() 28983 1726882993.38721: Calling all_inventory to load vars for managed_node2 28983 1726882993.38725: Calling groups_inventory to load vars for managed_node2 28983 1726882993.38729: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882993.38742: Calling all_plugins_play to load vars for managed_node2 28983 1726882993.38745: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882993.38749: Calling groups_plugins_play to load vars for managed_node2 28983 1726882993.40556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882993.43183: done with get_vars() 28983 1726882993.43219: done getting variables 28983 1726882993.43301: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726882993.43625: variable 'interface' from source: play vars TASK 
[Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:43:13 -0400 (0:00:00.464) 0:00:23.434 ****** 28983 1726882993.43665: entering _queue_task() for managed_node2/assert 28983 1726882993.44030: worker is 1 (out of 1 available) 28983 1726882993.44046: exiting _queue_task() for managed_node2/assert 28983 1726882993.44060: done queuing things up, now waiting for results queue to drain 28983 1726882993.44062: waiting for pending results... 28983 1726882993.44330: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 28983 1726882993.44421: in run() - task 0affe814-3a2d-b16d-c0a7-000000000643 28983 1726882993.44438: variable 'ansible_search_path' from source: unknown 28983 1726882993.44444: variable 'ansible_search_path' from source: unknown 28983 1726882993.44477: calling self._execute() 28983 1726882993.44557: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882993.44564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882993.44576: variable 'omit' from source: magic vars 28983 1726882993.44891: variable 'ansible_distribution_major_version' from source: facts 28983 1726882993.44904: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882993.44910: variable 'omit' from source: magic vars 28983 1726882993.44949: variable 'omit' from source: magic vars 28983 1726882993.45032: variable 'interface' from source: play vars 28983 1726882993.45048: variable 'omit' from source: magic vars 28983 1726882993.45088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882993.45122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 
1726882993.45141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882993.45158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882993.45168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882993.45197: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882993.45202: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882993.45205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882993.45293: Set connection var ansible_connection to ssh 28983 1726882993.45303: Set connection var ansible_shell_executable to /bin/sh 28983 1726882993.45312: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882993.45323: Set connection var ansible_timeout to 10 28983 1726882993.45331: Set connection var ansible_pipelining to False 28983 1726882993.45334: Set connection var ansible_shell_type to sh 28983 1726882993.45355: variable 'ansible_shell_executable' from source: unknown 28983 1726882993.45359: variable 'ansible_connection' from source: unknown 28983 1726882993.45361: variable 'ansible_module_compression' from source: unknown 28983 1726882993.45364: variable 'ansible_shell_type' from source: unknown 28983 1726882993.45368: variable 'ansible_shell_executable' from source: unknown 28983 1726882993.45372: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882993.45379: variable 'ansible_pipelining' from source: unknown 28983 1726882993.45382: variable 'ansible_timeout' from source: unknown 28983 1726882993.45387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882993.45505: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882993.45514: variable 'omit' from source: magic vars 28983 1726882993.45520: starting attempt loop 28983 1726882993.45523: running the handler 28983 1726882993.45736: variable 'interface_stat' from source: set_fact 28983 1726882993.45740: Evaluated conditional (not interface_stat.stat.exists): True 28983 1726882993.45743: handler run complete 28983 1726882993.45745: attempt loop complete, returning result 28983 1726882993.45748: _execute() done 28983 1726882993.45750: dumping result to json 28983 1726882993.45752: done dumping result, returning 28983 1726882993.45754: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-000000000643] 28983 1726882993.45757: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000643 28983 1726882993.45841: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000643 28983 1726882993.45844: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726882993.45921: no more pending results, returning what we have 28983 1726882993.45924: results queue empty 28983 1726882993.45925: checking for any_errors_fatal 28983 1726882993.45933: done checking for any_errors_fatal 28983 1726882993.45936: checking for max_fail_percentage 28983 1726882993.45938: done checking for max_fail_percentage 28983 1726882993.45939: checking to see if all hosts have failed and the running result is not ok 28983 1726882993.45940: done checking to see if all hosts have failed 28983 1726882993.45941: getting the remaining hosts for this loop 28983 1726882993.45942: done getting the remaining hosts for this loop 28983 1726882993.45946: getting the next task for 
host managed_node2 28983 1726882993.45955: done getting next task for host managed_node2 28983 1726882993.45958: ^ task is: TASK: Test 28983 1726882993.45961: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882993.45965: getting variables 28983 1726882993.45966: in VariableManager get_vars() 28983 1726882993.45996: Calling all_inventory to load vars for managed_node2 28983 1726882993.45999: Calling groups_inventory to load vars for managed_node2 28983 1726882993.46002: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882993.46011: Calling all_plugins_play to load vars for managed_node2 28983 1726882993.46015: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882993.46018: Calling groups_plugins_play to load vars for managed_node2 28983 1726882993.51401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882993.57239: done with get_vars() 28983 1726882993.57284: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:43:13 -0400 (0:00:00.137) 0:00:23.571 ****** 28983 1726882993.57413: entering _queue_task() for managed_node2/include_tasks 28983 1726882993.57891: worker is 1 (out of 1 available) 28983 1726882993.57905: exiting 
_queue_task() for managed_node2/include_tasks 28983 1726882993.57918: done queuing things up, now waiting for results queue to drain 28983 1726882993.57920: waiting for pending results... 28983 1726882993.58520: running TaskExecutor() for managed_node2/TASK: Test 28983 1726882993.58525: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005b8 28983 1726882993.58741: variable 'ansible_search_path' from source: unknown 28983 1726882993.58746: variable 'ansible_search_path' from source: unknown 28983 1726882993.58749: variable 'lsr_test' from source: include params 28983 1726882993.59217: variable 'lsr_test' from source: include params 28983 1726882993.59291: variable 'omit' from source: magic vars 28983 1726882993.59667: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882993.59683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882993.59695: variable 'omit' from source: magic vars 28983 1726882993.60384: variable 'ansible_distribution_major_version' from source: facts 28983 1726882993.60396: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882993.60403: variable 'item' from source: unknown 28983 1726882993.60739: variable 'item' from source: unknown 28983 1726882993.60743: variable 'item' from source: unknown 28983 1726882993.60793: variable 'item' from source: unknown 28983 1726882993.60935: dumping result to json 28983 1726882993.60940: done dumping result, returning 28983 1726882993.60943: done running TaskExecutor() for managed_node2/TASK: Test [0affe814-3a2d-b16d-c0a7-0000000005b8] 28983 1726882993.60946: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b8 28983 1726882993.60988: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b8 28983 1726882993.60992: WORKER PROCESS EXITING 28983 1726882993.61023: no more pending results, returning what we have 28983 1726882993.61030: in VariableManager get_vars() 28983 1726882993.61081: Calling 
all_inventory to load vars for managed_node2 28983 1726882993.61086: Calling groups_inventory to load vars for managed_node2 28983 1726882993.61091: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882993.61105: Calling all_plugins_play to load vars for managed_node2 28983 1726882993.61110: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882993.61114: Calling groups_plugins_play to load vars for managed_node2 28983 1726882993.65452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882993.69189: done with get_vars() 28983 1726882993.69221: variable 'ansible_search_path' from source: unknown 28983 1726882993.69222: variable 'ansible_search_path' from source: unknown 28983 1726882993.69271: we have included files to process 28983 1726882993.69273: generating all_blocks data 28983 1726882993.69275: done generating all_blocks data 28983 1726882993.69279: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 28983 1726882993.69280: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 28983 1726882993.69284: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 28983 1726882993.69660: done processing included file 28983 1726882993.69662: iterating over new_blocks loaded from include file 28983 1726882993.69664: in VariableManager get_vars() 28983 1726882993.69682: done with get_vars() 28983 1726882993.69684: filtering new block on tags 28983 1726882993.69727: done filtering new block on tags 28983 1726882993.69729: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed_node2 => (item=tasks/create_bridge_profile_no_autoconnect.yml) 28983 1726882993.69738: extending task lists for all hosts with included blocks 28983 1726882993.70886: done extending task lists 28983 1726882993.70888: done processing included files 28983 1726882993.70889: results queue empty 28983 1726882993.70890: checking for any_errors_fatal 28983 1726882993.70893: done checking for any_errors_fatal 28983 1726882993.70894: checking for max_fail_percentage 28983 1726882993.70896: done checking for max_fail_percentage 28983 1726882993.70897: checking to see if all hosts have failed and the running result is not ok 28983 1726882993.70898: done checking to see if all hosts have failed 28983 1726882993.70899: getting the remaining hosts for this loop 28983 1726882993.70900: done getting the remaining hosts for this loop 28983 1726882993.70903: getting the next task for host managed_node2 28983 1726882993.70909: done getting next task for host managed_node2 28983 1726882993.70911: ^ task is: TASK: Include network role 28983 1726882993.70915: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726882993.70918: getting variables 28983 1726882993.70919: in VariableManager get_vars() 28983 1726882993.70929: Calling all_inventory to load vars for managed_node2 28983 1726882993.70932: Calling groups_inventory to load vars for managed_node2 28983 1726882993.70937: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882993.70943: Calling all_plugins_play to load vars for managed_node2 28983 1726882993.70946: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882993.70950: Calling groups_plugins_play to load vars for managed_node2 28983 1726882993.73354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882993.77160: done with get_vars() 28983 1726882993.77195: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Friday 20 September 2024 21:43:13 -0400 (0:00:00.198) 0:00:23.770 ****** 28983 1726882993.77293: entering _queue_task() for managed_node2/include_role 28983 1726882993.78036: worker is 1 (out of 1 available) 28983 1726882993.78047: exiting _queue_task() for managed_node2/include_role 28983 1726882993.78059: done queuing things up, now waiting for results queue to drain 28983 1726882993.78061: waiting for pending results... 
28983 1726882993.78428: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726882993.78842: in run() - task 0affe814-3a2d-b16d-c0a7-0000000006b1 28983 1726882993.78846: variable 'ansible_search_path' from source: unknown 28983 1726882993.78849: variable 'ansible_search_path' from source: unknown 28983 1726882993.78852: calling self._execute() 28983 1726882993.78855: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882993.79061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882993.79065: variable 'omit' from source: magic vars 28983 1726882993.79988: variable 'ansible_distribution_major_version' from source: facts 28983 1726882993.80059: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882993.80072: _execute() done 28983 1726882993.80083: dumping result to json 28983 1726882993.80093: done dumping result, returning 28983 1726882993.80239: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-0000000006b1] 28983 1726882993.80243: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000006b1 28983 1726882993.80330: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000006b1 28983 1726882993.80335: WORKER PROCESS EXITING 28983 1726882993.80369: no more pending results, returning what we have 28983 1726882993.80374: in VariableManager get_vars() 28983 1726882993.80413: Calling all_inventory to load vars for managed_node2 28983 1726882993.80417: Calling groups_inventory to load vars for managed_node2 28983 1726882993.80421: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882993.80437: Calling all_plugins_play to load vars for managed_node2 28983 1726882993.80442: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882993.80446: Calling groups_plugins_play to load vars for managed_node2 28983 1726882993.84108: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882993.87876: done with get_vars() 28983 1726882993.87916: variable 'ansible_search_path' from source: unknown 28983 1726882993.87918: variable 'ansible_search_path' from source: unknown 28983 1726882993.88388: variable 'omit' from source: magic vars 28983 1726882993.88671: variable 'omit' from source: magic vars 28983 1726882993.88693: variable 'omit' from source: magic vars 28983 1726882993.88698: we have included files to process 28983 1726882993.88699: generating all_blocks data 28983 1726882993.88701: done generating all_blocks data 28983 1726882993.88703: processing included file: fedora.linux_system_roles.network 28983 1726882993.88733: in VariableManager get_vars() 28983 1726882993.88751: done with get_vars() 28983 1726882993.88785: in VariableManager get_vars() 28983 1726882993.88804: done with get_vars() 28983 1726882993.89055: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726882993.89423: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726882993.89545: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726882993.90627: in VariableManager get_vars() 28983 1726882993.90654: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726882993.94570: iterating over new_blocks loaded from include file 28983 1726882993.94573: in VariableManager get_vars() 28983 1726882993.94595: done with get_vars() 28983 1726882993.94597: filtering new block on tags 28983 1726882993.95055: done filtering new block on tags 28983 1726882993.95060: in VariableManager get_vars() 28983 1726882993.95079: done with get_vars() 28983 1726882993.95082: filtering new block on tags 28983 1726882993.95105: done 
filtering new block on tags 28983 1726882993.95107: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726882993.95114: extending task lists for all hosts with included blocks 28983 1726882993.95337: done extending task lists 28983 1726882993.95339: done processing included files 28983 1726882993.95340: results queue empty 28983 1726882993.95341: checking for any_errors_fatal 28983 1726882993.95346: done checking for any_errors_fatal 28983 1726882993.95347: checking for max_fail_percentage 28983 1726882993.95349: done checking for max_fail_percentage 28983 1726882993.95350: checking to see if all hosts have failed and the running result is not ok 28983 1726882993.95351: done checking to see if all hosts have failed 28983 1726882993.95352: getting the remaining hosts for this loop 28983 1726882993.95353: done getting the remaining hosts for this loop 28983 1726882993.95357: getting the next task for host managed_node2 28983 1726882993.95362: done getting next task for host managed_node2 28983 1726882993.95365: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726882993.95369: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882993.95381: getting variables 28983 1726882993.95382: in VariableManager get_vars() 28983 1726882993.95397: Calling all_inventory to load vars for managed_node2 28983 1726882993.95400: Calling groups_inventory to load vars for managed_node2 28983 1726882993.95403: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882993.95409: Calling all_plugins_play to load vars for managed_node2 28983 1726882993.95412: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882993.95416: Calling groups_plugins_play to load vars for managed_node2 28983 1726882994.03073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882994.06042: done with get_vars() 28983 1726882994.06079: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:43:14 -0400 (0:00:00.288) 0:00:24.059 ****** 28983 1726882994.06170: entering _queue_task() for managed_node2/include_tasks 28983 1726882994.06539: worker is 1 (out of 1 available) 28983 1726882994.06552: exiting _queue_task() for managed_node2/include_tasks 28983 1726882994.06568: done queuing things up, now waiting for results queue to drain 28983 1726882994.06570: waiting for pending results... 
28983 1726882994.06958: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726882994.07075: in run() - task 0affe814-3a2d-b16d-c0a7-00000000072f 28983 1726882994.07098: variable 'ansible_search_path' from source: unknown 28983 1726882994.07107: variable 'ansible_search_path' from source: unknown 28983 1726882994.07340: calling self._execute() 28983 1726882994.07346: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.07350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882994.07353: variable 'omit' from source: magic vars 28983 1726882994.07731: variable 'ansible_distribution_major_version' from source: facts 28983 1726882994.07753: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882994.07763: _execute() done 28983 1726882994.07770: dumping result to json 28983 1726882994.07778: done dumping result, returning 28983 1726882994.07798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-00000000072f] 28983 1726882994.08016: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000072f 28983 1726882994.08102: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000072f 28983 1726882994.08106: WORKER PROCESS EXITING 28983 1726882994.08171: no more pending results, returning what we have 28983 1726882994.08177: in VariableManager get_vars() 28983 1726882994.08226: Calling all_inventory to load vars for managed_node2 28983 1726882994.08230: Calling groups_inventory to load vars for managed_node2 28983 1726882994.08233: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882994.08248: Calling all_plugins_play to load vars for managed_node2 28983 1726882994.08252: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882994.08256: Calling 
groups_plugins_play to load vars for managed_node2 28983 1726882994.12054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882994.15102: done with get_vars() 28983 1726882994.15132: variable 'ansible_search_path' from source: unknown 28983 1726882994.15137: variable 'ansible_search_path' from source: unknown 28983 1726882994.15186: we have included files to process 28983 1726882994.15187: generating all_blocks data 28983 1726882994.15190: done generating all_blocks data 28983 1726882994.15194: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726882994.15195: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726882994.15197: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726882994.15926: done processing included file 28983 1726882994.15928: iterating over new_blocks loaded from include file 28983 1726882994.15930: in VariableManager get_vars() 28983 1726882994.15964: done with get_vars() 28983 1726882994.15966: filtering new block on tags 28983 1726882994.16006: done filtering new block on tags 28983 1726882994.16010: in VariableManager get_vars() 28983 1726882994.16039: done with get_vars() 28983 1726882994.16041: filtering new block on tags 28983 1726882994.16103: done filtering new block on tags 28983 1726882994.16107: in VariableManager get_vars() 28983 1726882994.16139: done with get_vars() 28983 1726882994.16142: filtering new block on tags 28983 1726882994.16202: done filtering new block on tags 28983 1726882994.16205: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726882994.16211: extending task lists for 
all hosts with included blocks 28983 1726882994.18868: done extending task lists 28983 1726882994.18870: done processing included files 28983 1726882994.18871: results queue empty 28983 1726882994.18872: checking for any_errors_fatal 28983 1726882994.18875: done checking for any_errors_fatal 28983 1726882994.18876: checking for max_fail_percentage 28983 1726882994.18877: done checking for max_fail_percentage 28983 1726882994.18878: checking to see if all hosts have failed and the running result is not ok 28983 1726882994.18885: done checking to see if all hosts have failed 28983 1726882994.18886: getting the remaining hosts for this loop 28983 1726882994.18888: done getting the remaining hosts for this loop 28983 1726882994.18892: getting the next task for host managed_node2 28983 1726882994.18898: done getting next task for host managed_node2 28983 1726882994.18901: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726882994.18907: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882994.18918: getting variables 28983 1726882994.18919: in VariableManager get_vars() 28983 1726882994.18938: Calling all_inventory to load vars for managed_node2 28983 1726882994.18941: Calling groups_inventory to load vars for managed_node2 28983 1726882994.18944: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882994.18950: Calling all_plugins_play to load vars for managed_node2 28983 1726882994.18954: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882994.18958: Calling groups_plugins_play to load vars for managed_node2 28983 1726882994.21726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882994.25285: done with get_vars() 28983 1726882994.25313: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:43:14 -0400 (0:00:00.192) 0:00:24.251 ****** 28983 1726882994.25387: entering _queue_task() for managed_node2/setup 28983 1726882994.25663: worker is 1 (out of 1 available) 28983 1726882994.25679: exiting _queue_task() for managed_node2/setup 28983 1726882994.25694: done queuing things up, now waiting for results queue to drain 28983 1726882994.25696: waiting for pending results... 
28983 1726882994.25895: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726882994.26017: in run() - task 0affe814-3a2d-b16d-c0a7-00000000078c 28983 1726882994.26035: variable 'ansible_search_path' from source: unknown 28983 1726882994.26041: variable 'ansible_search_path' from source: unknown 28983 1726882994.26074: calling self._execute() 28983 1726882994.26158: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.26162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882994.26177: variable 'omit' from source: magic vars 28983 1726882994.26562: variable 'ansible_distribution_major_version' from source: facts 28983 1726882994.26566: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882994.26997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882994.29711: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882994.29768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882994.29829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882994.30059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882994.30062: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882994.30065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882994.30083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882994.30130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882994.30205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882994.30229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882994.30320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882994.30368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882994.30424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882994.30608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882994.30612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882994.30955: variable '__network_required_facts' from source: role 
'' defaults 28983 1726882994.31065: variable 'ansible_facts' from source: unknown 28983 1726882994.32368: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726882994.32386: when evaluation is False, skipping this task 28983 1726882994.32394: _execute() done 28983 1726882994.32439: dumping result to json 28983 1726882994.32442: done dumping result, returning 28983 1726882994.32445: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-00000000078c] 28983 1726882994.32447: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000078c 28983 1726882994.32707: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000078c 28983 1726882994.32712: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882994.32771: no more pending results, returning what we have 28983 1726882994.32777: results queue empty 28983 1726882994.32778: checking for any_errors_fatal 28983 1726882994.32781: done checking for any_errors_fatal 28983 1726882994.32781: checking for max_fail_percentage 28983 1726882994.32784: done checking for max_fail_percentage 28983 1726882994.32785: checking to see if all hosts have failed and the running result is not ok 28983 1726882994.32786: done checking to see if all hosts have failed 28983 1726882994.32787: getting the remaining hosts for this loop 28983 1726882994.32789: done getting the remaining hosts for this loop 28983 1726882994.32794: getting the next task for host managed_node2 28983 1726882994.32812: done getting next task for host managed_node2 28983 1726882994.32817: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726882994.32828: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882994.32852: getting variables 28983 1726882994.32855: in VariableManager get_vars() 28983 1726882994.32933: Calling all_inventory to load vars for managed_node2 28983 1726882994.32937: Calling groups_inventory to load vars for managed_node2 28983 1726882994.32979: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882994.32989: Calling all_plugins_play to load vars for managed_node2 28983 1726882994.32993: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882994.33002: Calling groups_plugins_play to load vars for managed_node2 28983 1726882994.35848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882994.38328: done with get_vars() 28983 1726882994.38374: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:43:14 -0400 (0:00:00.131) 0:00:24.382 ****** 28983 1726882994.38493: entering _queue_task() for managed_node2/stat 28983 1726882994.38846: worker is 1 (out of 1 available) 28983 1726882994.38859: exiting _queue_task() for managed_node2/stat 28983 1726882994.38873: done queuing things up, now waiting for results queue to drain 28983 1726882994.38875: waiting for pending results... 
28983 1726882994.39170: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726882994.39296: in run() - task 0affe814-3a2d-b16d-c0a7-00000000078e 28983 1726882994.39310: variable 'ansible_search_path' from source: unknown 28983 1726882994.39314: variable 'ansible_search_path' from source: unknown 28983 1726882994.39348: calling self._execute() 28983 1726882994.39432: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.39438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882994.39453: variable 'omit' from source: magic vars 28983 1726882994.39781: variable 'ansible_distribution_major_version' from source: facts 28983 1726882994.39785: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882994.39928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882994.40151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882994.40190: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882994.40222: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882994.40253: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882994.40358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882994.40380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882994.40402: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882994.40425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882994.40503: variable '__network_is_ostree' from source: set_fact 28983 1726882994.40509: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726882994.40512: when evaluation is False, skipping this task 28983 1726882994.40517: _execute() done 28983 1726882994.40520: dumping result to json 28983 1726882994.40525: done dumping result, returning 28983 1726882994.40533: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-00000000078e] 28983 1726882994.40540: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000078e 28983 1726882994.40636: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000078e 28983 1726882994.40639: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726882994.40720: no more pending results, returning what we have 28983 1726882994.40724: results queue empty 28983 1726882994.40725: checking for any_errors_fatal 28983 1726882994.40733: done checking for any_errors_fatal 28983 1726882994.40736: checking for max_fail_percentage 28983 1726882994.40738: done checking for max_fail_percentage 28983 1726882994.40739: checking to see if all hosts have failed and the running result is not ok 28983 1726882994.40740: done checking to see if all hosts have failed 28983 1726882994.40741: getting the remaining hosts for this loop 28983 1726882994.40743: done getting the remaining hosts for this loop 28983 
1726882994.40748: getting the next task for host managed_node2 28983 1726882994.40756: done getting next task for host managed_node2 28983 1726882994.40761: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726882994.40768: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882994.40786: getting variables 28983 1726882994.40788: in VariableManager get_vars() 28983 1726882994.40821: Calling all_inventory to load vars for managed_node2 28983 1726882994.40824: Calling groups_inventory to load vars for managed_node2 28983 1726882994.40826: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882994.40843: Calling all_plugins_play to load vars for managed_node2 28983 1726882994.40846: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882994.40850: Calling groups_plugins_play to load vars for managed_node2 28983 1726882994.42092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882994.43814: done with get_vars() 28983 1726882994.43839: done getting variables 28983 1726882994.43892: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:43:14 -0400 (0:00:00.054) 0:00:24.437 ****** 28983 1726882994.43922: entering _queue_task() for managed_node2/set_fact 28983 1726882994.44180: worker is 1 (out of 1 available) 28983 1726882994.44194: exiting _queue_task() for managed_node2/set_fact 28983 1726882994.44207: done queuing things up, now waiting for results queue to drain 28983 1726882994.44209: waiting for pending results... 
28983 1726882994.44400: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726882994.44523: in run() - task 0affe814-3a2d-b16d-c0a7-00000000078f 28983 1726882994.44541: variable 'ansible_search_path' from source: unknown 28983 1726882994.44547: variable 'ansible_search_path' from source: unknown 28983 1726882994.44581: calling self._execute() 28983 1726882994.44665: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.44670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882994.44679: variable 'omit' from source: magic vars 28983 1726882994.44993: variable 'ansible_distribution_major_version' from source: facts 28983 1726882994.45002: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882994.45139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882994.45363: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882994.45401: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882994.45437: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882994.45465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882994.45565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882994.45588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882994.45609: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882994.45632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882994.45708: variable '__network_is_ostree' from source: set_fact 28983 1726882994.45714: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726882994.45717: when evaluation is False, skipping this task 28983 1726882994.45720: _execute() done 28983 1726882994.45725: dumping result to json 28983 1726882994.45729: done dumping result, returning 28983 1726882994.45740: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-00000000078f] 28983 1726882994.45745: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000078f 28983 1726882994.45843: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000078f 28983 1726882994.45847: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726882994.45902: no more pending results, returning what we have 28983 1726882994.45906: results queue empty 28983 1726882994.45907: checking for any_errors_fatal 28983 1726882994.45913: done checking for any_errors_fatal 28983 1726882994.45913: checking for max_fail_percentage 28983 1726882994.45915: done checking for max_fail_percentage 28983 1726882994.45916: checking to see if all hosts have failed and the running result is not ok 28983 1726882994.45917: done checking to see if all hosts have failed 28983 1726882994.45918: getting the remaining hosts for this loop 28983 1726882994.45920: done getting the remaining hosts for this loop 
28983 1726882994.45925: getting the next task for host managed_node2 28983 1726882994.45938: done getting next task for host managed_node2 28983 1726882994.45943: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726882994.45949: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882994.45964: getting variables 28983 1726882994.45966: in VariableManager get_vars() 28983 1726882994.46000: Calling all_inventory to load vars for managed_node2 28983 1726882994.46003: Calling groups_inventory to load vars for managed_node2 28983 1726882994.46005: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882994.46014: Calling all_plugins_play to load vars for managed_node2 28983 1726882994.46017: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882994.46020: Calling groups_plugins_play to load vars for managed_node2 28983 1726882994.47236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882994.48840: done with get_vars() 28983 1726882994.48865: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:43:14 -0400 (0:00:00.050) 0:00:24.487 ****** 28983 1726882994.48942: entering _queue_task() for managed_node2/service_facts 28983 1726882994.49165: worker is 1 (out of 1 available) 28983 1726882994.49181: exiting _queue_task() for managed_node2/service_facts 28983 1726882994.49193: done queuing things up, now waiting for results queue to drain 28983 1726882994.49195: waiting for pending results... 
28983 1726882994.49377: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726882994.49485: in run() - task 0affe814-3a2d-b16d-c0a7-000000000791 28983 1726882994.49498: variable 'ansible_search_path' from source: unknown 28983 1726882994.49502: variable 'ansible_search_path' from source: unknown 28983 1726882994.49534: calling self._execute() 28983 1726882994.49611: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.49616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882994.49627: variable 'omit' from source: magic vars 28983 1726882994.49935: variable 'ansible_distribution_major_version' from source: facts 28983 1726882994.49946: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882994.49953: variable 'omit' from source: magic vars 28983 1726882994.50023: variable 'omit' from source: magic vars 28983 1726882994.50056: variable 'omit' from source: magic vars 28983 1726882994.50096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882994.50126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882994.50146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882994.50161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882994.50171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882994.50201: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882994.50205: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.50210: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726882994.50291: Set connection var ansible_connection to ssh 28983 1726882994.50303: Set connection var ansible_shell_executable to /bin/sh 28983 1726882994.50313: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882994.50321: Set connection var ansible_timeout to 10 28983 1726882994.50328: Set connection var ansible_pipelining to False 28983 1726882994.50331: Set connection var ansible_shell_type to sh 28983 1726882994.50353: variable 'ansible_shell_executable' from source: unknown 28983 1726882994.50356: variable 'ansible_connection' from source: unknown 28983 1726882994.50360: variable 'ansible_module_compression' from source: unknown 28983 1726882994.50362: variable 'ansible_shell_type' from source: unknown 28983 1726882994.50366: variable 'ansible_shell_executable' from source: unknown 28983 1726882994.50370: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882994.50377: variable 'ansible_pipelining' from source: unknown 28983 1726882994.50379: variable 'ansible_timeout' from source: unknown 28983 1726882994.50385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882994.50541: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882994.50551: variable 'omit' from source: magic vars 28983 1726882994.50556: starting attempt loop 28983 1726882994.50560: running the handler 28983 1726882994.50575: _low_level_execute_command(): starting 28983 1726882994.50581: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882994.51126: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726882994.51130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.51135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882994.51138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.51185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882994.51205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882994.51284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882994.53051: stdout chunk (state=3): >>>/root <<< 28983 1726882994.53162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882994.53213: stderr chunk (state=3): >>><<< 28983 1726882994.53218: stdout chunk (state=3): >>><<< 28983 1726882994.53238: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882994.53251: _low_level_execute_command(): starting 28983 1726882994.53257: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051 `" && echo ansible-tmp-1726882994.5323935-29937-175090501010051="` echo /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051 `" ) && sleep 0' 28983 1726882994.53726: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882994.53730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726882994.53732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882994.53742: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882994.53746: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.53797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882994.53804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882994.53874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882994.55923: stdout chunk (state=3): >>>ansible-tmp-1726882994.5323935-29937-175090501010051=/root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051 <<< 28983 1726882994.56241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882994.56245: stdout chunk (state=3): >>><<< 28983 1726882994.56248: stderr chunk (state=3): >>><<< 28983 1726882994.56251: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882994.5323935-29937-175090501010051=/root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882994.56253: variable 'ansible_module_compression' from source: unknown 28983 1726882994.56256: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726882994.56303: variable 'ansible_facts' from source: unknown 28983 1726882994.56426: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py 28983 1726882994.56827: Sending initial data 28983 1726882994.56831: Sent initial data (162 bytes) 28983 1726882994.57249: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.57432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882994.57554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882994.57781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882994.59415: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28983 1726882994.59427: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882994.59493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726882994.59561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpe_j58_7a /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py <<< 28983 1726882994.59565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py" <<< 28983 1726882994.59633: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpe_j58_7a" to remote "/root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py" <<< 28983 1726882994.59637: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py" <<< 28983 1726882994.60660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882994.60682: stderr chunk (state=3): >>><<< 28983 1726882994.60686: stdout chunk (state=3): >>><<< 28983 1726882994.60708: done transferring module to remote 28983 1726882994.60841: _low_level_execute_command(): starting 28983 1726882994.60844: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/ /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py && sleep 0' 28983 1726882994.61636: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882994.61644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882994.61651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 
28983 1726882994.61658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.61680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882994.61687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.61768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882994.61777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882994.61785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882994.61882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882994.63852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882994.64039: stderr chunk (state=3): >>><<< 28983 1726882994.64043: stdout chunk (state=3): >>><<< 28983 1726882994.64046: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882994.64049: _low_level_execute_command(): starting 28983 1726882994.64052: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/AnsiballZ_service_facts.py && sleep 0' 28983 1726882994.64451: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882994.64461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882994.64475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882994.64494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882994.64501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882994.64510: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882994.64527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.64544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726882994.64554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726882994.64562: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726882994.64571: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882994.64582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882994.64596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882994.64606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882994.64613: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726882994.64624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882994.64700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882994.64715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882994.64737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882994.64842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882996.63441: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state":<<< 28983 1726882996.63536: stdout chunk (state=3): >>> "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", 
"status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": 
"systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": 
"systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": 
{"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "stati<<< 28983 1726882996.63598: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726882996.65156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882996.65226: stderr chunk (state=3): >>><<< 28983 1726882996.65230: stdout chunk (state=3): >>><<< 28983 1726882996.65343: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": 
"plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": 
"systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": 
"systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": 
"fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": 
"systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": 
"systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882996.67338: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882996.67363: _low_level_execute_command(): starting 28983 1726882996.67529: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882994.5323935-29937-175090501010051/ > /dev/null 2>&1 && sleep 0' 28983 1726882996.68275: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 <<< 28983 1726882996.68279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.68332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882996.68338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882996.68409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882996.68454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882996.68540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882996.70532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882996.70537: stdout chunk (state=3): >>><<< 28983 1726882996.70545: stderr chunk (state=3): >>><<< 28983 1726882996.70559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882996.70566: handler run complete 28983 1726882996.70741: variable 'ansible_facts' from source: unknown 28983 1726882996.70873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882996.71877: variable 'ansible_facts' from source: unknown 28983 1726882996.72009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882996.72216: attempt loop complete, returning result 28983 1726882996.72223: _execute() done 28983 1726882996.72227: dumping result to json 28983 1726882996.72278: done dumping result, returning 28983 1726882996.72285: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000000791] 28983 1726882996.72290: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000791 28983 1726882996.73725: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000791 28983 1726882996.73728: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was 
specified for this result", "changed": false } 28983 1726882996.73847: no more pending results, returning what we have 28983 1726882996.73851: results queue empty 28983 1726882996.73852: checking for any_errors_fatal 28983 1726882996.73856: done checking for any_errors_fatal 28983 1726882996.73857: checking for max_fail_percentage 28983 1726882996.73859: done checking for max_fail_percentage 28983 1726882996.73860: checking to see if all hosts have failed and the running result is not ok 28983 1726882996.73861: done checking to see if all hosts have failed 28983 1726882996.73862: getting the remaining hosts for this loop 28983 1726882996.73864: done getting the remaining hosts for this loop 28983 1726882996.73868: getting the next task for host managed_node2 28983 1726882996.73878: done getting next task for host managed_node2 28983 1726882996.73886: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726882996.73894: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882996.73905: getting variables 28983 1726882996.73907: in VariableManager get_vars() 28983 1726882996.73939: Calling all_inventory to load vars for managed_node2 28983 1726882996.73943: Calling groups_inventory to load vars for managed_node2 28983 1726882996.73946: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882996.73955: Calling all_plugins_play to load vars for managed_node2 28983 1726882996.73958: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882996.73962: Calling groups_plugins_play to load vars for managed_node2 28983 1726882996.76184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882996.79304: done with get_vars() 28983 1726882996.79346: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:43:16 -0400 (0:00:02.305) 0:00:26.792 ****** 28983 1726882996.79475: entering _queue_task() for managed_node2/package_facts 28983 1726882996.79829: worker is 1 (out of 1 available) 28983 1726882996.79845: exiting _queue_task() for managed_node2/package_facts 28983 1726882996.79859: done queuing things up, now waiting for results queue to drain 28983 1726882996.79861: waiting for pending results... 
28983 1726882996.80223: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726882996.80415: in run() - task 0affe814-3a2d-b16d-c0a7-000000000792 28983 1726882996.80641: variable 'ansible_search_path' from source: unknown 28983 1726882996.80645: variable 'ansible_search_path' from source: unknown 28983 1726882996.80647: calling self._execute() 28983 1726882996.80650: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882996.80654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882996.80657: variable 'omit' from source: magic vars 28983 1726882996.81117: variable 'ansible_distribution_major_version' from source: facts 28983 1726882996.81139: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882996.81150: variable 'omit' from source: magic vars 28983 1726882996.81271: variable 'omit' from source: magic vars 28983 1726882996.81328: variable 'omit' from source: magic vars 28983 1726882996.81382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882996.81439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882996.81467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882996.81497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882996.81513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882996.81563: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882996.81575: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882996.81641: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726882996.81720: Set connection var ansible_connection to ssh 28983 1726882996.81743: Set connection var ansible_shell_executable to /bin/sh 28983 1726882996.81770: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882996.81791: Set connection var ansible_timeout to 10 28983 1726882996.81804: Set connection var ansible_pipelining to False 28983 1726882996.81813: Set connection var ansible_shell_type to sh 28983 1726882996.81844: variable 'ansible_shell_executable' from source: unknown 28983 1726882996.81852: variable 'ansible_connection' from source: unknown 28983 1726882996.81869: variable 'ansible_module_compression' from source: unknown 28983 1726882996.81939: variable 'ansible_shell_type' from source: unknown 28983 1726882996.81943: variable 'ansible_shell_executable' from source: unknown 28983 1726882996.81945: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882996.81948: variable 'ansible_pipelining' from source: unknown 28983 1726882996.81950: variable 'ansible_timeout' from source: unknown 28983 1726882996.81952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882996.82168: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726882996.82201: variable 'omit' from source: magic vars 28983 1726882996.82213: starting attempt loop 28983 1726882996.82221: running the handler 28983 1726882996.82243: _low_level_execute_command(): starting 28983 1726882996.82257: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882996.83075: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882996.83086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 28983 1726882996.83102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.83115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882996.83129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882996.83238: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882996.83250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882996.83364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882996.85144: stdout chunk (state=3): >>>/root <<< 28983 1726882996.85352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882996.85355: stdout chunk (state=3): >>><<< 28983 1726882996.85358: stderr chunk (state=3): >>><<< 28983 1726882996.85382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882996.85405: _low_level_execute_command(): starting 28983 1726882996.85429: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714 `" && echo ansible-tmp-1726882996.8539035-30022-187963364591714="` echo /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714 `" ) && sleep 0' 28983 1726882996.86080: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882996.86085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882996.86088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.86091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882996.86094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882996.86103: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726882996.86168: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882996.86174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726882996.86180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726882996.86185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726882996.86188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882996.86190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.86295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882996.86298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882996.86369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882996.88396: stdout chunk (state=3): >>>ansible-tmp-1726882996.8539035-30022-187963364591714=/root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714 <<< 28983 1726882996.88520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882996.88565: stderr chunk (state=3): >>><<< 28983 1726882996.88569: stdout chunk (state=3): >>><<< 28983 1726882996.88588: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882996.8539035-30022-187963364591714=/root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882996.88625: variable 'ansible_module_compression' from source: unknown 28983 1726882996.88664: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726882996.88721: variable 'ansible_facts' from source: unknown 28983 1726882996.88860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py 28983 1726882996.88978: Sending initial data 28983 1726882996.88981: Sent initial data (162 bytes) 28983 1726882996.89396: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882996.89436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882996.89440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found <<< 28983 1726882996.89442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882996.89446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.89449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882996.89517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882996.89523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882996.89602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882996.91262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882996.91325: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 28983 1726882996.91393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpwl_umg4i /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py <<< 28983 1726882996.91396: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py" <<< 28983 1726882996.91454: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpwl_umg4i" to remote "/root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py" <<< 28983 1726882996.94135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882996.94140: stdout chunk (state=3): >>><<< 28983 1726882996.94147: stderr chunk (state=3): >>><<< 28983 1726882996.94174: done transferring module to remote 28983 1726882996.94198: _low_level_execute_command(): starting 28983 1726882996.94202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/ /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py && sleep 0' 28983 1726882996.94706: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882996.94716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.94740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882996.94750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882996.94777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882996.94814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882996.94829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882996.94900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882996.97040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882996.97044: stdout chunk (state=3): >>><<< 28983 1726882996.97046: stderr chunk (state=3): >>><<< 28983 1726882996.97049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882996.97051: _low_level_execute_command(): starting 28983 1726882996.97054: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/AnsiballZ_package_facts.py && sleep 0' 28983 1726882996.98256: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726882996.98362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882996.98466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882997.61780: 
stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 28983 1726882997.61950: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 
2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": 
"5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 28983 1726882997.61967: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 28983 1726882997.61972: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 28983 1726882997.61977: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 28983 1726882997.61981: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": 
"0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": 
[{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", 
"version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": 
"python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726882997.62087: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": 
"121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", 
"version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": 
[{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": 
"5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726882997.63879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726882997.63969: stderr chunk (state=3): >>><<< 28983 1726882997.63980: stdout chunk (state=3): >>><<< 28983 1726882997.64036: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", 
"release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": 
"amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": 
"3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", 
"version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": 
"libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": 
"0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": 
"1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": 
"initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": 
[{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", 
"release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", 
"version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": 
"2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": 
"0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726882997.70305: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882997.70309: _low_level_execute_command(): starting 28983 1726882997.70312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882996.8539035-30022-187963364591714/ > /dev/null 2>&1 && sleep 0' 28983 1726882997.71658: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882997.71663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882997.71805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882997.71824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882997.72120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882997.72347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882997.74358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882997.74440: stderr chunk (state=3): >>><<< 28983 1726882997.74476: stdout chunk (state=3): >>><<< 28983 1726882997.74506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882997.74515: handler run complete 28983 1726882997.76264: variable 'ansible_facts' from source: unknown 28983 
1726882997.77359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882997.81339: variable 'ansible_facts' from source: unknown 28983 1726882997.82224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882997.83785: attempt loop complete, returning result 28983 1726882997.83806: _execute() done 28983 1726882997.83809: dumping result to json 28983 1726882997.84308: done dumping result, returning 28983 1726882997.84312: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000000792] 28983 1726882997.84315: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000792 28983 1726882997.88078: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000792 28983 1726882997.88082: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882997.88269: no more pending results, returning what we have 28983 1726882997.88275: results queue empty 28983 1726882997.88276: checking for any_errors_fatal 28983 1726882997.88281: done checking for any_errors_fatal 28983 1726882997.88282: checking for max_fail_percentage 28983 1726882997.88284: done checking for max_fail_percentage 28983 1726882997.88286: checking to see if all hosts have failed and the running result is not ok 28983 1726882997.88287: done checking to see if all hosts have failed 28983 1726882997.88287: getting the remaining hosts for this loop 28983 1726882997.88289: done getting the remaining hosts for this loop 28983 1726882997.88294: getting the next task for host managed_node2 28983 1726882997.88302: done getting next task for host managed_node2 28983 1726882997.88307: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 
1726882997.88313: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882997.88325: getting variables 28983 1726882997.88327: in VariableManager get_vars() 28983 1726882997.88384: Calling all_inventory to load vars for managed_node2 28983 1726882997.88387: Calling groups_inventory to load vars for managed_node2 28983 1726882997.88391: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882997.88400: Calling all_plugins_play to load vars for managed_node2 28983 1726882997.88404: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882997.88408: Calling groups_plugins_play to load vars for managed_node2 28983 1726882997.90764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882997.93746: done with get_vars() 28983 1726882997.93783: done getting variables 28983 1726882997.93857: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:43:17 -0400 (0:00:01.144) 0:00:27.936 ****** 28983 1726882997.93903: entering _queue_task() for managed_node2/debug 28983 1726882997.94279: worker is 1 (out of 1 available) 28983 1726882997.94293: exiting _queue_task() for managed_node2/debug 28983 1726882997.94304: done queuing things up, now waiting for results queue to drain 28983 1726882997.94306: waiting for pending results... 
28983 1726882997.94614: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726882997.94711: in run() - task 0affe814-3a2d-b16d-c0a7-000000000730 28983 1726882997.94728: variable 'ansible_search_path' from source: unknown 28983 1726882997.94732: variable 'ansible_search_path' from source: unknown 28983 1726882997.94776: calling self._execute() 28983 1726882997.94883: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882997.94929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882997.94932: variable 'omit' from source: magic vars 28983 1726882997.95331: variable 'ansible_distribution_major_version' from source: facts 28983 1726882997.95347: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882997.95360: variable 'omit' from source: magic vars 28983 1726882997.95443: variable 'omit' from source: magic vars 28983 1726882997.95581: variable 'network_provider' from source: set_fact 28983 1726882997.95585: variable 'omit' from source: magic vars 28983 1726882997.95631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882997.95690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882997.95697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882997.95727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882997.95799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882997.95803: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882997.95806: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726882997.95808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882997.95908: Set connection var ansible_connection to ssh 28983 1726882997.95920: Set connection var ansible_shell_executable to /bin/sh 28983 1726882997.95932: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882997.95948: Set connection var ansible_timeout to 10 28983 1726882997.96018: Set connection var ansible_pipelining to False 28983 1726882997.96021: Set connection var ansible_shell_type to sh 28983 1726882997.96024: variable 'ansible_shell_executable' from source: unknown 28983 1726882997.96027: variable 'ansible_connection' from source: unknown 28983 1726882997.96029: variable 'ansible_module_compression' from source: unknown 28983 1726882997.96032: variable 'ansible_shell_type' from source: unknown 28983 1726882997.96036: variable 'ansible_shell_executable' from source: unknown 28983 1726882997.96038: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882997.96041: variable 'ansible_pipelining' from source: unknown 28983 1726882997.96043: variable 'ansible_timeout' from source: unknown 28983 1726882997.96046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882997.96185: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882997.96198: variable 'omit' from source: magic vars 28983 1726882997.96205: starting attempt loop 28983 1726882997.96208: running the handler 28983 1726882997.96274: handler run complete 28983 1726882997.96299: attempt loop complete, returning result 28983 1726882997.96303: _execute() done 28983 1726882997.96306: dumping result to json 28983 1726882997.96308: done dumping result, returning 
28983 1726882997.96311: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000000730] 28983 1726882997.96313: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000730 28983 1726882997.96412: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000730 28983 1726882997.96416: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726882997.96491: no more pending results, returning what we have 28983 1726882997.96495: results queue empty 28983 1726882997.96496: checking for any_errors_fatal 28983 1726882997.96508: done checking for any_errors_fatal 28983 1726882997.96509: checking for max_fail_percentage 28983 1726882997.96511: done checking for max_fail_percentage 28983 1726882997.96512: checking to see if all hosts have failed and the running result is not ok 28983 1726882997.96513: done checking to see if all hosts have failed 28983 1726882997.96514: getting the remaining hosts for this loop 28983 1726882997.96517: done getting the remaining hosts for this loop 28983 1726882997.96522: getting the next task for host managed_node2 28983 1726882997.96532: done getting next task for host managed_node2 28983 1726882997.96541: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726882997.96547: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882997.96561: getting variables 28983 1726882997.96562: in VariableManager get_vars() 28983 1726882997.96601: Calling all_inventory to load vars for managed_node2 28983 1726882997.96604: Calling groups_inventory to load vars for managed_node2 28983 1726882997.96607: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882997.96619: Calling all_plugins_play to load vars for managed_node2 28983 1726882997.96623: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882997.96627: Calling groups_plugins_play to load vars for managed_node2 28983 1726882997.98890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.02467: done with get_vars() 28983 1726882998.02504: done getting variables 28983 1726882998.02885: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:43:18 -0400 (0:00:00.090) 0:00:28.027 ****** 28983 1726882998.02931: entering _queue_task() for managed_node2/fail 28983 1726882998.03312: worker is 1 (out of 1 available) 28983 1726882998.03326: exiting _queue_task() for managed_node2/fail 28983 1726882998.03340: done queuing things up, now waiting for results queue to drain 28983 1726882998.03342: waiting for pending results... 28983 1726882998.03754: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726882998.03802: in run() - task 0affe814-3a2d-b16d-c0a7-000000000731 28983 1726882998.03825: variable 'ansible_search_path' from source: unknown 28983 1726882998.03838: variable 'ansible_search_path' from source: unknown 28983 1726882998.03941: calling self._execute() 28983 1726882998.04003: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.04017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.04036: variable 'omit' from source: magic vars 28983 1726882998.04486: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.04510: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.04673: variable 'network_state' from source: role '' defaults 28983 1726882998.04691: Evaluated conditional (network_state != {}): False 28983 1726882998.04718: when evaluation is False, skipping this task 28983 1726882998.04722: _execute() done 28983 1726882998.04725: dumping result to json 28983 1726882998.04727: done dumping result, returning 28983 1726882998.04738: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000000731] 28983 1726882998.04828: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000731 28983 1726882998.04909: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000731 28983 1726882998.04913: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882998.04989: no more pending results, returning what we have 28983 1726882998.04994: results queue empty 28983 1726882998.04995: checking for any_errors_fatal 28983 1726882998.05004: done checking for any_errors_fatal 28983 1726882998.05005: checking for max_fail_percentage 28983 1726882998.05008: done checking for max_fail_percentage 28983 1726882998.05009: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.05010: done checking to see if all hosts have failed 28983 1726882998.05011: getting the remaining hosts for this loop 28983 1726882998.05013: done getting the remaining hosts for this loop 28983 1726882998.05018: getting the next task for host managed_node2 28983 1726882998.05028: done getting next task for host managed_node2 28983 1726882998.05035: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726882998.05042: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882998.05076: getting variables 28983 1726882998.05078: in VariableManager get_vars() 28983 1726882998.05119: Calling all_inventory to load vars for managed_node2 28983 1726882998.05123: Calling groups_inventory to load vars for managed_node2 28983 1726882998.05126: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.05541: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.05546: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.05551: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.08330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.11991: done with get_vars() 28983 1726882998.12025: done getting variables 28983 1726882998.12096: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:43:18 -0400 (0:00:00.092) 0:00:28.119 ****** 28983 1726882998.12138: entering _queue_task() for managed_node2/fail 28983 1726882998.12444: worker is 1 (out of 1 available) 28983 1726882998.12460: exiting _queue_task() for managed_node2/fail 28983 1726882998.12475: done queuing things up, now waiting for results queue to drain 28983 1726882998.12478: waiting for pending results... 28983 1726882998.12810: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726882998.12945: in run() - task 0affe814-3a2d-b16d-c0a7-000000000732 28983 1726882998.12970: variable 'ansible_search_path' from source: unknown 28983 1726882998.12980: variable 'ansible_search_path' from source: unknown 28983 1726882998.13125: calling self._execute() 28983 1726882998.13146: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.13159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.13176: variable 'omit' from source: magic vars 28983 1726882998.13703: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.13740: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.14045: variable 'network_state' from source: role '' defaults 28983 1726882998.14049: Evaluated conditional (network_state != {}): False 28983 1726882998.14052: when evaluation is False, skipping this task 28983 1726882998.14054: _execute() done 28983 1726882998.14056: dumping result to json 28983 1726882998.14058: done dumping result, returning 28983 1726882998.14061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000000732] 28983 1726882998.14063: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000732 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882998.14339: no more pending results, returning what we have 28983 1726882998.14343: results queue empty 28983 1726882998.14345: checking for any_errors_fatal 28983 1726882998.14356: done checking for any_errors_fatal 28983 1726882998.14357: checking for max_fail_percentage 28983 1726882998.14359: done checking for max_fail_percentage 28983 1726882998.14360: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.14361: done checking to see if all hosts have failed 28983 1726882998.14362: getting the remaining hosts for this loop 28983 1726882998.14364: done getting the remaining hosts for this loop 28983 1726882998.14370: getting the next task for host managed_node2 28983 1726882998.14538: done getting next task for host managed_node2 28983 1726882998.14545: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726882998.14551: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882998.14571: getting variables 28983 1726882998.14573: in VariableManager get_vars() 28983 1726882998.14610: Calling all_inventory to load vars for managed_node2 28983 1726882998.14614: Calling groups_inventory to load vars for managed_node2 28983 1726882998.14617: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.14626: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.14630: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.14654: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.14667: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000732 28983 1726882998.14670: WORKER PROCESS EXITING 28983 1726882998.17153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.20472: done with get_vars() 28983 1726882998.20512: done getting variables 28983 1726882998.20578: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:43:18 -0400 (0:00:00.084) 0:00:28.203 ****** 28983 1726882998.20622: entering _queue_task() for managed_node2/fail 28983 1726882998.20923: worker is 1 (out of 1 available) 28983 1726882998.21141: exiting _queue_task() for managed_node2/fail 28983 1726882998.21154: done queuing things up, now waiting for results queue to drain 28983 1726882998.21157: waiting for pending results... 28983 1726882998.21278: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726882998.21470: in run() - task 0affe814-3a2d-b16d-c0a7-000000000733 28983 1726882998.21501: variable 'ansible_search_path' from source: unknown 28983 1726882998.21513: variable 'ansible_search_path' from source: unknown 28983 1726882998.21559: calling self._execute() 28983 1726882998.21711: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.21715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.21718: variable 'omit' from source: magic vars 28983 1726882998.22177: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.22196: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.22437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882998.25440: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882998.25444: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882998.25447: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 
1726882998.25449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882998.25452: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882998.25512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.25553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.25597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.25655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.25686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.25808: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.25830: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726882998.25988: variable 'ansible_distribution' from source: facts 28983 1726882998.26004: variable '__network_rh_distros' from source: role '' defaults 28983 1726882998.26019: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726882998.26027: when evaluation is False, skipping this task 28983 1726882998.26036: _execute() done 28983 1726882998.26045: dumping result to json 28983 
1726882998.26054: done dumping result, returning 28983 1726882998.26066: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000000733] 28983 1726882998.26078: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000733 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726882998.26277: no more pending results, returning what we have 28983 1726882998.26281: results queue empty 28983 1726882998.26282: checking for any_errors_fatal 28983 1726882998.26289: done checking for any_errors_fatal 28983 1726882998.26290: checking for max_fail_percentage 28983 1726882998.26292: done checking for max_fail_percentage 28983 1726882998.26293: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.26294: done checking to see if all hosts have failed 28983 1726882998.26295: getting the remaining hosts for this loop 28983 1726882998.26298: done getting the remaining hosts for this loop 28983 1726882998.26304: getting the next task for host managed_node2 28983 1726882998.26314: done getting next task for host managed_node2 28983 1726882998.26319: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726882998.26439: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882998.26463: getting variables 28983 1726882998.26465: in VariableManager get_vars() 28983 1726882998.26506: Calling all_inventory to load vars for managed_node2 28983 1726882998.26509: Calling groups_inventory to load vars for managed_node2 28983 1726882998.26513: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.26523: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.26528: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.26531: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.27280: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000733 28983 1726882998.27285: WORKER PROCESS EXITING 28983 1726882998.28991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.33642: done with get_vars() 28983 1726882998.33690: done getting variables 28983 1726882998.33768: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:43:18 -0400 (0:00:00.131) 0:00:28.335 ****** 28983 1726882998.33807: entering _queue_task() for managed_node2/dnf 28983 1726882998.34221: worker is 1 (out of 1 available) 28983 1726882998.34305: exiting _queue_task() for managed_node2/dnf 28983 1726882998.34318: done queuing things up, now waiting for results queue to drain 28983 1726882998.34320: waiting for pending results... 28983 1726882998.34525: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726882998.34716: in run() - task 0affe814-3a2d-b16d-c0a7-000000000734 28983 1726882998.34744: variable 'ansible_search_path' from source: unknown 28983 1726882998.34765: variable 'ansible_search_path' from source: unknown 28983 1726882998.34811: calling self._execute() 28983 1726882998.34930: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.34949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.34977: variable 'omit' from source: magic vars 28983 1726882998.35443: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.35515: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.35751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882998.38869: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882998.38960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882998.39013: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882998.39062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882998.39098: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882998.39200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.39336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.39341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.39348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.39370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.39510: variable 'ansible_distribution' from source: facts 28983 1726882998.39522: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.39538: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726882998.39697: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882998.39892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.39928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.39966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.40029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.40055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.40118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.40153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.40187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.40315: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.40318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.40330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.40367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.40403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.40468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.40490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.40708: variable 'network_connections' from source: include params 28983 1726882998.40726: variable 'interface' from source: play vars 28983 1726882998.40860: variable 'interface' from source: play vars 28983 1726882998.40915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882998.41150: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882998.41208: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882998.41251: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882998.41297: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882998.41360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882998.41391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882998.41643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.41646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882998.41649: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726882998.41893: variable 'network_connections' from source: include params 28983 1726882998.41904: variable 'interface' from source: play vars 28983 1726882998.41988: variable 'interface' from source: play vars 28983 1726882998.42030: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726882998.42041: when evaluation is False, skipping this task 28983 1726882998.42049: _execute() done 28983 1726882998.42057: dumping result to json 28983 1726882998.42065: done dumping result, returning 28983 1726882998.42077: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000734] 28983 1726882998.42095: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000734 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726882998.42270: no more pending results, returning what we have 28983 1726882998.42274: results queue empty 28983 1726882998.42275: checking for any_errors_fatal 28983 1726882998.42284: done checking for any_errors_fatal 28983 1726882998.42285: checking for max_fail_percentage 28983 1726882998.42287: done checking for max_fail_percentage 28983 1726882998.42288: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.42289: done checking to see if all hosts have failed 28983 1726882998.42290: getting the remaining hosts for this loop 28983 1726882998.42292: done getting the remaining hosts for this loop 28983 1726882998.42298: getting the next task for host managed_node2 28983 1726882998.42315: done getting next task for host managed_node2 28983 1726882998.42321: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726882998.42327: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882998.42350: getting variables 28983 1726882998.42352: in VariableManager get_vars() 28983 1726882998.42393: Calling all_inventory to load vars for managed_node2 28983 1726882998.42397: Calling groups_inventory to load vars for managed_node2 28983 1726882998.42400: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.42411: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.42546: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.42552: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000734 28983 1726882998.42556: WORKER PROCESS EXITING 28983 1726882998.42560: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.45279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.48397: done with get_vars() 28983 1726882998.48463: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726882998.48565: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:43:18 -0400 (0:00:00.147) 0:00:28.483 ****** 28983 1726882998.48605: entering _queue_task() for managed_node2/yum 28983 1726882998.49054: worker is 1 (out of 1 available) 28983 1726882998.49438: exiting _queue_task() for managed_node2/yum 28983 1726882998.49453: done queuing things up, now waiting for results queue to drain 28983 1726882998.49455: waiting for pending results... 28983 1726882998.49756: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726882998.50129: in run() - task 0affe814-3a2d-b16d-c0a7-000000000735 28983 1726882998.50156: variable 'ansible_search_path' from source: unknown 28983 1726882998.50242: variable 'ansible_search_path' from source: unknown 28983 1726882998.50290: calling self._execute() 28983 1726882998.50491: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.50504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.50521: variable 'omit' from source: magic vars 28983 1726882998.51014: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.51032: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.51265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882998.55377: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882998.55478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882998.55539: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882998.55589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882998.55629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882998.55740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.55969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.55975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.56064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.56095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.56443: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.56451: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28983 1726882998.56459: when evaluation is False, skipping this task 
28983 1726882998.56466: _execute() done 28983 1726882998.56477: dumping result to json 28983 1726882998.56487: done dumping result, returning 28983 1726882998.56499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000735] 28983 1726882998.56510: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000735 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726882998.56731: no more pending results, returning what we have 28983 1726882998.56738: results queue empty 28983 1726882998.56739: checking for any_errors_fatal 28983 1726882998.56747: done checking for any_errors_fatal 28983 1726882998.56748: checking for max_fail_percentage 28983 1726882998.56750: done checking for max_fail_percentage 28983 1726882998.56751: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.56752: done checking to see if all hosts have failed 28983 1726882998.56753: getting the remaining hosts for this loop 28983 1726882998.56756: done getting the remaining hosts for this loop 28983 1726882998.56761: getting the next task for host managed_node2 28983 1726882998.56951: done getting next task for host managed_node2 28983 1726882998.56958: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726882998.56964: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
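The skip recorded above can be read directly from the `false_condition` field: the task at `roles/network/tasks/main.yml:48` is guarded by a distribution-version check that evaluated to False on this Fedora host. A minimal sketch of such a task follows; only the task name and the `when` expression are taken from the log, while the module arguments are hypothetical placeholders:

```yaml
# Sketch reconstructed from the log's task banner and false_condition field.
# The yum module arguments are NOT shown in the log and are illustrative only.
- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    list: updates          # hypothetical argument
  when: ansible_distribution_major_version | int < 8
```

Because the managed node reports a major version of 8 or later, Ansible records `skip_reason: "Conditional result was False"` without ever contacting the package manager.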
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882998.56987: getting variables 28983 1726882998.56989: in VariableManager get_vars() 28983 1726882998.57029: Calling all_inventory to load vars for managed_node2 28983 1726882998.57032: Calling groups_inventory to load vars for managed_node2 28983 1726882998.57038: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.57053: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.57057: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.57060: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.57751: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000735 28983 1726882998.57755: WORKER PROCESS EXITING 28983 1726882998.59530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.67880: done with get_vars() 28983 1726882998.67924: done getting variables 28983 1726882998.67997: Loading ActionModule 'fail' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:43:18 -0400 (0:00:00.194) 0:00:28.678 ****** 28983 1726882998.68033: entering _queue_task() for managed_node2/fail 28983 1726882998.68428: worker is 1 (out of 1 available) 28983 1726882998.68446: exiting _queue_task() for managed_node2/fail 28983 1726882998.68459: done queuing things up, now waiting for results queue to drain 28983 1726882998.68462: waiting for pending results... 28983 1726882998.68714: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726882998.68851: in run() - task 0affe814-3a2d-b16d-c0a7-000000000736 28983 1726882998.68865: variable 'ansible_search_path' from source: unknown 28983 1726882998.68871: variable 'ansible_search_path' from source: unknown 28983 1726882998.68910: calling self._execute() 28983 1726882998.68989: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.68997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.69012: variable 'omit' from source: magic vars 28983 1726882998.69349: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.69361: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.69465: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882998.69632: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882998.72019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882998.72170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882998.72185: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882998.72256: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882998.72312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882998.72441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.72468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.72494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.72530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.72545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.72592: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.72613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.72637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.72669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.72684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.72721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.72743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.72763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.72797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726882998.72809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.72963: variable 'network_connections' from source: include params 28983 1726882998.72974: variable 'interface' from source: play vars 28983 1726882998.73032: variable 'interface' from source: play vars 28983 1726882998.73097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882998.73230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882998.73275: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882998.73304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882998.73329: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882998.73366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882998.73390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882998.73411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.73432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882998.73489: variable 
'__network_team_connections_defined' from source: role '' defaults 28983 1726882998.73704: variable 'network_connections' from source: include params 28983 1726882998.73707: variable 'interface' from source: play vars 28983 1726882998.73763: variable 'interface' from source: play vars 28983 1726882998.73793: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726882998.73797: when evaluation is False, skipping this task 28983 1726882998.73800: _execute() done 28983 1726882998.73803: dumping result to json 28983 1726882998.73808: done dumping result, returning 28983 1726882998.73815: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000736] 28983 1726882998.73826: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000736 28983 1726882998.73931: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000736 28983 1726882998.73934: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726882998.74006: no more pending results, returning what we have 28983 1726882998.74010: results queue empty 28983 1726882998.74011: checking for any_errors_fatal 28983 1726882998.74023: done checking for any_errors_fatal 28983 1726882998.74024: checking for max_fail_percentage 28983 1726882998.74026: done checking for max_fail_percentage 28983 1726882998.74027: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.74028: done checking to see if all hosts have failed 28983 1726882998.74029: getting the remaining hosts for this loop 28983 1726882998.74031: done getting the remaining hosts for this loop 28983 1726882998.74038: getting the next task for host 
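The next skip follows the same pattern. The log shows the `fail` action module being loaded for the task at `main.yml:60`, and the `false_condition` names two role-default flags. A sketch of such a guard, assuming a `fail`-based consent prompt (the message text is hypothetical and does not appear in the log):

```yaml
# Sketch based on the loaded 'fail' ActionModule and the logged false_condition.
- name: >-
    Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager may disrupt connectivity  # hypothetical text
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since the play's `network_connections` define neither wireless nor team interfaces, both flags are false and the task is skipped.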
managed_node2 28983 1726882998.74048: done getting next task for host managed_node2 28983 1726882998.74053: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726882998.74059: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882998.74078: getting variables 28983 1726882998.74080: in VariableManager get_vars() 28983 1726882998.74116: Calling all_inventory to load vars for managed_node2 28983 1726882998.74120: Calling groups_inventory to load vars for managed_node2 28983 1726882998.74123: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.74132: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.74143: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.74147: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.76616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.79803: done with get_vars() 28983 1726882998.79836: done getting variables 28983 1726882998.79890: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:43:18 -0400 (0:00:00.118) 0:00:28.797 ****** 28983 1726882998.79921: entering _queue_task() for managed_node2/package 28983 1726882998.80191: worker is 1 (out of 1 available) 28983 1726882998.80207: exiting _queue_task() for managed_node2/package 28983 1726882998.80222: done queuing things up, now waiting for results queue to drain 28983 1726882998.80224: waiting for pending results... 
28983 1726882998.80422: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726882998.80548: in run() - task 0affe814-3a2d-b16d-c0a7-000000000737 28983 1726882998.80562: variable 'ansible_search_path' from source: unknown 28983 1726882998.80568: variable 'ansible_search_path' from source: unknown 28983 1726882998.80601: calling self._execute() 28983 1726882998.80687: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.80691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.80701: variable 'omit' from source: magic vars 28983 1726882998.81023: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.81035: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.81200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726882998.81426: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726882998.81469: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726882998.81500: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726882998.81561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726882998.81654: variable 'network_packages' from source: role '' defaults 28983 1726882998.81741: variable '__network_provider_setup' from source: role '' defaults 28983 1726882998.81752: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726882998.81809: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726882998.81816: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726882998.81876: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726882998.82028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726882998.83883: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726882998.83932: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726882998.83965: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726882998.83993: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726882998.84015: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726882998.84084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.84108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.84129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.84167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.84182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726882998.84220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.84241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.84263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.84298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.84311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.84498: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726882998.84602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.84624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.84647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.84712: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.84715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.84766: variable 'ansible_python' from source: facts 28983 1726882998.84781: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726882998.84852: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726882998.84918: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726882998.85024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.85051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.85071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.85103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.85115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.85160: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726882998.85184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726882998.85205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.85236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726882998.85253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726882998.85369: variable 'network_connections' from source: include params 28983 1726882998.85376: variable 'interface' from source: play vars 28983 1726882998.85456: variable 'interface' from source: play vars 28983 1726882998.85518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726882998.85543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726882998.85567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726882998.85598: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726882998.85638: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882998.85875: variable 'network_connections' from source: include params 28983 1726882998.85879: variable 'interface' from source: play vars 28983 1726882998.85963: variable 'interface' from source: play vars 28983 1726882998.86006: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726882998.86077: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726882998.86329: variable 'network_connections' from source: include params 28983 1726882998.86333: variable 'interface' from source: play vars 28983 1726882998.86392: variable 'interface' from source: play vars 28983 1726882998.86414: variable '__network_packages_default_team' from source: role '' defaults 28983 1726882998.86482: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726882998.86732: variable 'network_connections' from source: include params 28983 1726882998.86738: variable 'interface' from source: play vars 28983 1726882998.86793: variable 'interface' from source: play vars 28983 1726882998.86850: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726882998.86906: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726882998.86915: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882998.86966: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882998.87153: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726882998.87547: variable 'network_connections' from source: include params 28983 1726882998.87551: variable 'interface' from 
source: play vars 28983 1726882998.87605: variable 'interface' from source: play vars 28983 1726882998.87614: variable 'ansible_distribution' from source: facts 28983 1726882998.87618: variable '__network_rh_distros' from source: role '' defaults 28983 1726882998.87625: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.87647: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726882998.87789: variable 'ansible_distribution' from source: facts 28983 1726882998.87793: variable '__network_rh_distros' from source: role '' defaults 28983 1726882998.87799: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.87805: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726882998.87946: variable 'ansible_distribution' from source: facts 28983 1726882998.87950: variable '__network_rh_distros' from source: role '' defaults 28983 1726882998.87956: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.87989: variable 'network_provider' from source: set_fact 28983 1726882998.88004: variable 'ansible_facts' from source: unknown 28983 1726882998.88589: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726882998.88593: when evaluation is False, skipping this task 28983 1726882998.88595: _execute() done 28983 1726882998.88598: dumping result to json 28983 1726882998.88604: done dumping result, returning 28983 1726882998.88612: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000000737] 28983 1726882998.88617: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000737 28983 1726882998.88716: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000737 28983 1726882998.88719: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726882998.88801: no more pending results, returning what we have 28983 1726882998.88805: results queue empty 28983 1726882998.88806: checking for any_errors_fatal 28983 1726882998.88813: done checking for any_errors_fatal 28983 1726882998.88814: checking for max_fail_percentage 28983 1726882998.88816: done checking for max_fail_percentage 28983 1726882998.88817: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.88818: done checking to see if all hosts have failed 28983 1726882998.88819: getting the remaining hosts for this loop 28983 1726882998.88821: done getting the remaining hosts for this loop 28983 1726882998.88826: getting the next task for host managed_node2 28983 1726882998.88837: done getting next task for host managed_node2 28983 1726882998.88843: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726882998.88849: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
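The "Install packages" skip at `main.yml:73` is the most informative of the three: the log shows the `package` action module being loaded, the `network_packages` role default being resolved, and a `subset` test against the gathered package facts. A sketch of a task shaped like this guard (the `state` argument is an assumption; the variable name, module, and condition are all visible in the log):

```yaml
# Sketch based on the loaded 'package' ActionModule, the 'network_packages'
# variable, and the logged false_condition. 'state: present' is assumed.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

The `subset` Jinja2 test makes the install idempotent at the fact level: when every required package already appears in `ansible_facts.packages`, the condition is False and the package manager is never invoked, which is exactly the skip the log records here.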
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726882998.88868: getting variables 28983 1726882998.88870: in VariableManager get_vars() 28983 1726882998.88911: Calling all_inventory to load vars for managed_node2 28983 1726882998.88914: Calling groups_inventory to load vars for managed_node2 28983 1726882998.88917: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.88927: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.88930: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.88933: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.90369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.91969: done with get_vars() 28983 1726882998.91995: done getting variables 28983 1726882998.92046: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:43:18 -0400 (0:00:00.121) 0:00:28.918 ****** 28983 1726882998.92076: entering _queue_task() for managed_node2/package 28983 1726882998.92317: worker is 1 (out of 1 available) 28983 1726882998.92332: exiting _queue_task() for managed_node2/package 28983 1726882998.92348: done queuing things up, now waiting for results queue to drain 28983 
1726882998.92350: waiting for pending results... 28983 1726882998.92547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726882998.92667: in run() - task 0affe814-3a2d-b16d-c0a7-000000000738 28983 1726882998.92684: variable 'ansible_search_path' from source: unknown 28983 1726882998.92688: variable 'ansible_search_path' from source: unknown 28983 1726882998.92721: calling self._execute() 28983 1726882998.92810: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882998.92815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882998.92823: variable 'omit' from source: magic vars 28983 1726882998.93144: variable 'ansible_distribution_major_version' from source: facts 28983 1726882998.93155: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882998.93260: variable 'network_state' from source: role '' defaults 28983 1726882998.93271: Evaluated conditional (network_state != {}): False 28983 1726882998.93277: when evaluation is False, skipping this task 28983 1726882998.93281: _execute() done 28983 1726882998.93284: dumping result to json 28983 1726882998.93287: done dumping result, returning 28983 1726882998.93294: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000738] 28983 1726882998.93300: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000738 28983 1726882998.93403: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000738 28983 1726882998.93406: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726882998.93465: no more pending results, returning what we have 28983 1726882998.93469: 
results queue empty 28983 1726882998.93470: checking for any_errors_fatal 28983 1726882998.93477: done checking for any_errors_fatal 28983 1726882998.93478: checking for max_fail_percentage 28983 1726882998.93480: done checking for max_fail_percentage 28983 1726882998.93481: checking to see if all hosts have failed and the running result is not ok 28983 1726882998.93482: done checking to see if all hosts have failed 28983 1726882998.93483: getting the remaining hosts for this loop 28983 1726882998.93485: done getting the remaining hosts for this loop 28983 1726882998.93488: getting the next task for host managed_node2 28983 1726882998.93497: done getting next task for host managed_node2 28983 1726882998.93501: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726882998.93507: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882998.93532: getting variables 28983 1726882998.93535: in VariableManager get_vars() 28983 1726882998.93567: Calling all_inventory to load vars for managed_node2 28983 1726882998.93570: Calling groups_inventory to load vars for managed_node2 28983 1726882998.93575: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882998.93583: Calling all_plugins_play to load vars for managed_node2 28983 1726882998.93585: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882998.93588: Calling groups_plugins_play to load vars for managed_node2 28983 1726882998.94804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882998.97085: done with get_vars() 28983 1726882998.97106: done getting variables 28983 1726882998.97158: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:43:18 -0400 (0:00:00.051) 0:00:28.969 ****** 28983 1726882998.97185: entering _queue_task() for managed_node2/package 28983 1726882998.97401: worker is 1 (out of 1 available) 28983 1726882998.97417: exiting _queue_task() for managed_node2/package 28983 1726882998.97432: done queuing things up, now waiting for results queue to drain 28983 1726882998.97436: waiting for pending results... 
28983 1726882998.97631: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
28983 1726882998.97754: in run() - task 0affe814-3a2d-b16d-c0a7-000000000739
28983 1726882998.97773: variable 'ansible_search_path' from source: unknown
28983 1726882998.97777: variable 'ansible_search_path' from source: unknown
28983 1726882998.97807: calling self._execute()
28983 1726882998.97885: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726882998.97896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726882998.97904: variable 'omit' from source: magic vars
28983 1726882998.98222: variable 'ansible_distribution_major_version' from source: facts
28983 1726882998.98231: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726882998.98333: variable 'network_state' from source: role '' defaults
28983 1726882998.98344: Evaluated conditional (network_state != {}): False
28983 1726882998.98348: when evaluation is False, skipping this task
28983 1726882998.98352: _execute() done
28983 1726882998.98354: dumping result to json
28983 1726882998.98359: done dumping result, returning
28983 1726882998.98368: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000739]
28983 1726882998.98374: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000739
28983 1726882998.98477: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000739
28983 1726882998.98481: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
28983 1726882998.98532: no more pending results, returning what we have
28983 1726882998.98537: results queue empty
28983 1726882998.98538: checking for any_errors_fatal
28983 1726882998.98545: done checking for any_errors_fatal
28983 1726882998.98546: checking for max_fail_percentage
28983 1726882998.98548: done checking for max_fail_percentage
28983 1726882998.98549: checking to see if all hosts have failed and the running result is not ok
28983 1726882998.98550: done checking to see if all hosts have failed
28983 1726882998.98551: getting the remaining hosts for this loop
28983 1726882998.98553: done getting the remaining hosts for this loop
28983 1726882998.98557: getting the next task for host managed_node2
28983 1726882998.98564: done getting next task for host managed_node2
28983 1726882998.98568: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28983 1726882998.98574: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726882998.98592: getting variables
28983 1726882998.98593: in VariableManager get_vars()
28983 1726882998.98625: Calling all_inventory to load vars for managed_node2
28983 1726882998.98627: Calling groups_inventory to load vars for managed_node2
28983 1726882998.98630: Calling all_plugins_inventory to load vars for managed_node2
28983 1726882998.98645: Calling all_plugins_play to load vars for managed_node2
28983 1726882998.98648: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726882998.98650: Calling groups_plugins_play to load vars for managed_node2
28983 1726882999.00873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726882999.03844: done with get_vars()
28983 1726882999.03879: done getting variables
28983 1726882999.03947: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:43:19 -0400 (0:00:00.067)       0:00:29.037 ******
28983 1726882999.03988: entering _queue_task() for managed_node2/service
28983 1726882999.04310: worker is 1 (out of 1 available)
28983 1726882999.04325: exiting _queue_task() for managed_node2/service
28983 1726882999.04340: done queuing things up, now waiting for results queue to drain
28983 1726882999.04342: waiting for pending results...
28983 1726882999.04757: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28983 1726882999.04828: in run() - task 0affe814-3a2d-b16d-c0a7-00000000073a
28983 1726882999.04858: variable 'ansible_search_path' from source: unknown
28983 1726882999.04867: variable 'ansible_search_path' from source: unknown
28983 1726882999.04911: calling self._execute()
28983 1726882999.05025: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726882999.05041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726882999.05058: variable 'omit' from source: magic vars
28983 1726882999.05518: variable 'ansible_distribution_major_version' from source: facts
28983 1726882999.05540: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726882999.05685: variable '__network_wireless_connections_defined' from source: role '' defaults
28983 1726882999.05941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726882999.08661: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726882999.09103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726882999.09155: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28983 1726882999.09207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28983 1726882999.09246: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28983 1726882999.09343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.09384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.09530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.09534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.09538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.09560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.09593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.09628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.09690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.09711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.09772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.09806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.09842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.09901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.09921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.10155: variable 'network_connections' from source: include params
28983 1726882999.10174: variable 'interface' from source: play vars
28983 1726882999.10266: variable 'interface' from source: play vars
28983 1726882999.10363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28983 1726882999.10576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28983 1726882999.10630: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28983 1726882999.10673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28983 1726882999.10730: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28983 1726882999.10787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28983 1726882999.10822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28983 1726882999.10939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.10942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28983 1726882999.10980: variable '__network_team_connections_defined' from source: role '' defaults
28983 1726882999.11318: variable 'network_connections' from source: include params
28983 1726882999.11329: variable 'interface' from source: play vars
28983 1726882999.11414: variable 'interface' from source: play vars
28983 1726882999.11457: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
28983 1726882999.11466: when evaluation is False, skipping this task
28983 1726882999.11473: _execute() done
28983 1726882999.11481: dumping result to json
28983 1726882999.11493: done dumping result, returning
28983 1726882999.11506: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000073a]
28983 1726882999.11515: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073a
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
28983 1726882999.11691: no more pending results, returning what we have
28983 1726882999.11695: results queue empty
28983 1726882999.11696: checking for any_errors_fatal
28983 1726882999.11705: done checking for any_errors_fatal
28983 1726882999.11706: checking for max_fail_percentage
28983 1726882999.11708: done checking for max_fail_percentage
28983 1726882999.11709: checking to see if all hosts have failed and the running result is not ok
28983 1726882999.11710: done checking to see if all hosts have failed
28983 1726882999.11711: getting the remaining hosts for this loop
28983 1726882999.11713: done getting the remaining hosts for this loop
28983 1726882999.11719: getting the next task for host managed_node2
28983 1726882999.11730: done getting next task for host managed_node2
28983 1726882999.11737: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28983 1726882999.11744: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726882999.11765: getting variables
28983 1726882999.11767: in VariableManager get_vars()
28983 1726882999.11807: Calling all_inventory to load vars for managed_node2
28983 1726882999.11811: Calling groups_inventory to load vars for managed_node2
28983 1726882999.11814: Calling all_plugins_inventory to load vars for managed_node2
28983 1726882999.11824: Calling all_plugins_play to load vars for managed_node2
28983 1726882999.11828: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726882999.11832: Calling groups_plugins_play to load vars for managed_node2
28983 1726882999.12049: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073a
28983 1726882999.12053: WORKER PROCESS EXITING
28983 1726882999.14554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726882999.17499: done with get_vars()
28983 1726882999.17533: done getting variables
28983 1726882999.17601: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:43:19 -0400 (0:00:00.136)       0:00:29.174 ******
28983 1726882999.17641: entering _queue_task() for managed_node2/service
28983 1726882999.17964: worker is 1 (out of 1 available)
28983 1726882999.17977: exiting _queue_task() for managed_node2/service
28983 1726882999.17991: done queuing things up, now waiting for results queue to drain
28983 1726882999.17993: waiting for pending results...
28983 1726882999.18303: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28983 1726882999.18640: in run() - task 0affe814-3a2d-b16d-c0a7-00000000073b
28983 1726882999.18644: variable 'ansible_search_path' from source: unknown
28983 1726882999.18647: variable 'ansible_search_path' from source: unknown
28983 1726882999.18649: calling self._execute()
28983 1726882999.18657: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726882999.18670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726882999.18686: variable 'omit' from source: magic vars
28983 1726882999.19130: variable 'ansible_distribution_major_version' from source: facts
28983 1726882999.19151: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726882999.19375: variable 'network_provider' from source: set_fact
28983 1726882999.19388: variable 'network_state' from source: role '' defaults
28983 1726882999.19403: Evaluated conditional (network_provider == "nm" or network_state != {}): True
28983 1726882999.19420: variable 'omit' from source: magic vars
28983 1726882999.19506: variable 'omit' from source: magic vars
28983 1726882999.19548: variable 'network_service_name' from source: role '' defaults
28983 1726882999.19636: variable 'network_service_name' from source: role '' defaults
28983 1726882999.19778: variable '__network_provider_setup' from source: role '' defaults
28983 1726882999.19790: variable '__network_service_name_default_nm' from source: role '' defaults
28983 1726882999.19873: variable '__network_service_name_default_nm' from source: role '' defaults
28983 1726882999.19887: variable '__network_packages_default_nm' from source: role '' defaults
28983 1726882999.19972: variable '__network_packages_default_nm' from source: role '' defaults
28983 1726882999.20287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726882999.22851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726882999.22948: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726882999.22994: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28983 1726882999.23048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28983 1726882999.23153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28983 1726882999.23185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.23224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.23265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.23322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.23346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.23410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.23445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.23484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.23540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.23561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.23877: variable '__network_packages_default_gobject_packages' from source: role '' defaults
28983 1726882999.24039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.24129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.24132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.24166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.24187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.24305: variable 'ansible_python' from source: facts
28983 1726882999.24327: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
28983 1726882999.24433: variable '__network_wpa_supplicant_required' from source: role '' defaults
28983 1726882999.24543: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
28983 1726882999.24718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.24755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.24840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.24852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.24873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.24942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726882999.24985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726882999.25026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.25083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726882999.25211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726882999.25294: variable 'network_connections' from source: include params
28983 1726882999.25307: variable 'interface' from source: play vars
28983 1726882999.25405: variable 'interface' from source: play vars
28983 1726882999.25542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28983 1726882999.25788: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28983 1726882999.25856: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28983 1726882999.25914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28983 1726882999.25973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28983 1726882999.26052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28983 1726882999.26097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28983 1726882999.26142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726882999.26198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28983 1726882999.26250: variable '__network_wireless_connections_defined' from source: role '' defaults
28983 1726882999.26841: variable 'network_connections' from source: include params
28983 1726882999.26844: variable 'interface' from source: play vars
28983 1726882999.26846: variable 'interface' from source: play vars
28983 1726882999.26849: variable '__network_packages_default_wireless' from source: role '' defaults
28983 1726882999.26908: variable '__network_wireless_connections_defined' from source: role '' defaults
28983 1726882999.27315: variable 'network_connections' from source: include params
28983 1726882999.27327: variable 'interface' from source: play vars
28983 1726882999.27421: variable 'interface' from source: play vars
28983 1726882999.27459: variable '__network_packages_default_team' from source: role '' defaults
28983 1726882999.27568: variable '__network_team_connections_defined' from source: role '' defaults
28983 1726882999.27969: variable 'network_connections' from source: include params
28983 1726882999.27981: variable 'interface' from source: play vars
28983 1726882999.28074: variable 'interface' from source: play vars
28983 1726882999.28159: variable '__network_service_name_default_initscripts'
from source: role '' defaults 28983 1726882999.28239: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726882999.28253: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882999.28337: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726882999.28651: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726882999.29343: variable 'network_connections' from source: include params 28983 1726882999.29360: variable 'interface' from source: play vars 28983 1726882999.29440: variable 'interface' from source: play vars 28983 1726882999.29461: variable 'ansible_distribution' from source: facts 28983 1726882999.29470: variable '__network_rh_distros' from source: role '' defaults 28983 1726882999.29482: variable 'ansible_distribution_major_version' from source: facts 28983 1726882999.29511: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726882999.29758: variable 'ansible_distribution' from source: facts 28983 1726882999.29768: variable '__network_rh_distros' from source: role '' defaults 28983 1726882999.29778: variable 'ansible_distribution_major_version' from source: facts 28983 1726882999.29842: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726882999.30029: variable 'ansible_distribution' from source: facts 28983 1726882999.30042: variable '__network_rh_distros' from source: role '' defaults 28983 1726882999.30055: variable 'ansible_distribution_major_version' from source: facts 28983 1726882999.30103: variable 'network_provider' from source: set_fact 28983 1726882999.30137: variable 'omit' from source: magic vars 28983 1726882999.30175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726882999.30212: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726882999.30238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726882999.30263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882999.30388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726882999.30392: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726882999.30395: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882999.30397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882999.30467: Set connection var ansible_connection to ssh 28983 1726882999.30487: Set connection var ansible_shell_executable to /bin/sh 28983 1726882999.30508: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726882999.30523: Set connection var ansible_timeout to 10 28983 1726882999.30535: Set connection var ansible_pipelining to False 28983 1726882999.30543: Set connection var ansible_shell_type to sh 28983 1726882999.30572: variable 'ansible_shell_executable' from source: unknown 28983 1726882999.30581: variable 'ansible_connection' from source: unknown 28983 1726882999.30588: variable 'ansible_module_compression' from source: unknown 28983 1726882999.30595: variable 'ansible_shell_type' from source: unknown 28983 1726882999.30608: variable 'ansible_shell_executable' from source: unknown 28983 1726882999.30615: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882999.30623: variable 'ansible_pipelining' from source: unknown 28983 1726882999.30630: variable 'ansible_timeout' from source: unknown 28983 1726882999.30641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726882999.30766: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726882999.30787: variable 'omit' from source: magic vars 28983 1726882999.30801: starting attempt loop 28983 1726882999.30809: running the handler 28983 1726882999.30912: variable 'ansible_facts' from source: unknown 28983 1726882999.32494: _low_level_execute_command(): starting 28983 1726882999.32497: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726882999.33161: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882999.33165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882999.33208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882999.33225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882999.33246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726882999.33390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882999.35219: stdout chunk (state=3): >>>/root <<< 28983 1726882999.35427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882999.35430: stdout chunk (state=3): >>><<< 28983 1726882999.35432: stderr chunk (state=3): >>><<< 28983 1726882999.35563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882999.35567: _low_level_execute_command(): starting 28983 1726882999.35571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565 `" && echo ansible-tmp-1726882999.3545809-30105-105510814568565="` echo 
/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565 `" ) && sleep 0' 28983 1726882999.36120: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882999.36251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882999.36255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882999.36301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882999.36366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882999.38424: stdout chunk (state=3): >>>ansible-tmp-1726882999.3545809-30105-105510814568565=/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565 <<< 28983 1726882999.38627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882999.38631: stdout chunk (state=3): >>><<< 28983 1726882999.38633: stderr chunk (state=3): >>><<< 28983 1726882999.38841: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882999.3545809-30105-105510814568565=/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882999.38845: variable 'ansible_module_compression' from source: unknown 28983 1726882999.38848: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726882999.38850: variable 'ansible_facts' from source: unknown 28983 1726882999.39011: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py 28983 1726882999.39203: Sending initial data 28983 1726882999.39213: Sent initial data (156 bytes) 28983 1726882999.39829: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882999.39943: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726882999.39975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882999.39995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882999.40096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882999.41739: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726882999.41789: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726882999.41837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726882999.41919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmppnq7xpnn /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py <<< 28983 1726882999.41923: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py" <<< 28983 1726882999.41979: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmppnq7xpnn" to remote "/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py" <<< 28983 1726882999.44620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882999.44684: stderr chunk (state=3): >>><<< 28983 1726882999.44687: stdout chunk (state=3): >>><<< 28983 1726882999.44709: done transferring module to remote 28983 1726882999.44718: _low_level_execute_command(): starting 28983 1726882999.44724: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/ /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py && sleep 0' 28983 1726882999.45139: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882999.45174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882999.45177: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882999.45180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726882999.45184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726882999.45187: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882999.45239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882999.45244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882999.45316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882999.47323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882999.47368: stderr chunk (state=3): >>><<< 28983 1726882999.47371: stdout chunk (state=3): >>><<< 28983 1726882999.47386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882999.47389: _low_level_execute_command(): starting 28983 1726882999.47395: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/AnsiballZ_systemd.py && sleep 0' 28983 1726882999.47791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882999.47819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882999.47823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882999.47825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726882999.47885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726882999.47888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882999.47966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882999.82151: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; 
pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4444160", "MemoryAvailable": "infinity", "CPUUsageNSec": "1499755000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", 
"TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726882999.82179: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module 
cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726882999.84259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882999.84355: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726882999.84372: stderr chunk (state=3): >>><<< 28983 1726882999.84382: stdout chunk (state=3): >>><<< 28983 1726882999.84408: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not 
set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4444160", "MemoryAvailable": "infinity", "CPUUsageNSec": "1499755000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": 
"[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726882999.84719: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726882999.84755: _low_level_execute_command(): starting 28983 1726882999.84767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882999.3545809-30105-105510814568565/ > /dev/null 2>&1 && sleep 0' 28983 1726882999.85423: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726882999.85442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726882999.85460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726882999.85480: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726882999.85605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726882999.85627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726882999.85740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726882999.87750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726882999.87765: stdout chunk (state=3): >>><<< 28983 1726882999.87787: stderr chunk (state=3): >>><<< 28983 1726882999.87807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726882999.87839: handler run complete 28983 1726882999.87924: attempt loop complete, returning result 28983 1726882999.87935: _execute() done 28983 1726882999.88139: dumping result to json 28983 1726882999.88143: done dumping result, returning 28983 1726882999.88145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-00000000073b] 28983 1726882999.88148: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073b ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726882999.88608: no more pending results, returning what we have 28983 1726882999.88612: results queue empty 28983 1726882999.88613: checking for any_errors_fatal 28983 1726882999.88620: done checking for any_errors_fatal 28983 1726882999.88621: checking for max_fail_percentage 28983 1726882999.88623: done checking for max_fail_percentage 28983 1726882999.88624: checking to see if all hosts have failed and the running result is not ok 28983 1726882999.88625: done checking to see if all hosts have failed 28983 1726882999.88626: getting the remaining hosts for this loop 28983 1726882999.88628: done getting the remaining hosts for this loop 28983 1726882999.88841: getting the next task for host managed_node2 28983 1726882999.88851: done getting next 
task for host managed_node2 28983 1726882999.88856: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726882999.88863: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726882999.88877: getting variables 28983 1726882999.88879: in VariableManager get_vars() 28983 1726882999.88913: Calling all_inventory to load vars for managed_node2 28983 1726882999.88917: Calling groups_inventory to load vars for managed_node2 28983 1726882999.88920: Calling all_plugins_inventory to load vars for managed_node2 28983 1726882999.88930: Calling all_plugins_play to load vars for managed_node2 28983 1726882999.88940: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726882999.88946: Calling groups_plugins_play to load vars for managed_node2 28983 1726882999.89561: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073b 28983 1726882999.89565: WORKER PROCESS EXITING 28983 1726882999.91485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726882999.95162: done with get_vars() 28983 1726882999.95200: done getting variables 28983 1726882999.95421: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:43:19 -0400 (0:00:00.778) 0:00:29.952 ****** 28983 1726882999.95468: entering _queue_task() for managed_node2/service 28983 1726882999.96393: worker is 1 (out of 1 available) 28983 1726882999.96414: exiting _queue_task() for managed_node2/service 28983 1726882999.96429: done queuing things up, now waiting for results queue to drain 28983 1726882999.96431: waiting for pending results... 
28983 1726882999.96938: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726882999.97550: in run() - task 0affe814-3a2d-b16d-c0a7-00000000073c 28983 1726882999.97568: variable 'ansible_search_path' from source: unknown 28983 1726882999.97572: variable 'ansible_search_path' from source: unknown 28983 1726882999.97646: calling self._execute() 28983 1726882999.97966: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726882999.97974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726882999.97993: variable 'omit' from source: magic vars 28983 1726882999.98923: variable 'ansible_distribution_major_version' from source: facts 28983 1726882999.99146: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726882999.99313: variable 'network_provider' from source: set_fact 28983 1726882999.99321: Evaluated conditional (network_provider == "nm"): True 28983 1726882999.99660: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726882999.99981: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883000.00491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883000.03566: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883000.03650: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883000.03693: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883000.03737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883000.03775: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883000.03879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883000.03924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883000.03945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883000.04000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883000.04014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883000.04076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883000.04206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883000.04210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883000.04213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883000.04216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883000.04316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883000.04320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883000.04322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883000.04369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883000.04385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883000.04612: variable 'network_connections' from source: include params 28983 1726883000.04626: variable 'interface' from source: play vars 28983 1726883000.04709: variable 'interface' from source: play vars 28983 1726883000.04893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883000.05482: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883000.05486: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883000.05511: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883000.05696: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883000.05700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883000.05702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883000.05705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883000.05739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883000.05933: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883000.06341: variable 'network_connections' from source: include params 28983 1726883000.06344: variable 'interface' from source: play vars 28983 1726883000.06650: variable 'interface' from source: play vars 28983 1726883000.06698: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883000.06702: when evaluation is False, skipping this task 28983 1726883000.06704: _execute() done 28983 1726883000.06710: dumping result to json 28983 1726883000.06715: done dumping result, returning 28983 1726883000.06729: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-00000000073c] 28983 
1726883000.06742: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073c 28983 1726883000.07046: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073c 28983 1726883000.07050: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883000.07103: no more pending results, returning what we have 28983 1726883000.07107: results queue empty 28983 1726883000.07108: checking for any_errors_fatal 28983 1726883000.07133: done checking for any_errors_fatal 28983 1726883000.07137: checking for max_fail_percentage 28983 1726883000.07140: done checking for max_fail_percentage 28983 1726883000.07141: checking to see if all hosts have failed and the running result is not ok 28983 1726883000.07142: done checking to see if all hosts have failed 28983 1726883000.07143: getting the remaining hosts for this loop 28983 1726883000.07144: done getting the remaining hosts for this loop 28983 1726883000.07149: getting the next task for host managed_node2 28983 1726883000.07158: done getting next task for host managed_node2 28983 1726883000.07168: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883000.07177: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883000.07196: getting variables 28983 1726883000.07198: in VariableManager get_vars() 28983 1726883000.07279: Calling all_inventory to load vars for managed_node2 28983 1726883000.07283: Calling groups_inventory to load vars for managed_node2 28983 1726883000.07286: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883000.07296: Calling all_plugins_play to load vars for managed_node2 28983 1726883000.07299: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883000.07303: Calling groups_plugins_play to load vars for managed_node2 28983 1726883000.12169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883000.18590: done with get_vars() 28983 1726883000.18631: done getting variables 28983 1726883000.18823: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:43:20 -0400 (0:00:00.234) 0:00:30.187 
****** 28983 1726883000.18971: entering _queue_task() for managed_node2/service 28983 1726883000.19796: worker is 1 (out of 1 available) 28983 1726883000.19811: exiting _queue_task() for managed_node2/service 28983 1726883000.19829: done queuing things up, now waiting for results queue to drain 28983 1726883000.19831: waiting for pending results... 28983 1726883000.20166: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883000.20524: in run() - task 0affe814-3a2d-b16d-c0a7-00000000073d 28983 1726883000.20545: variable 'ansible_search_path' from source: unknown 28983 1726883000.20549: variable 'ansible_search_path' from source: unknown 28983 1726883000.20590: calling self._execute() 28983 1726883000.20956: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883000.20959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883000.20963: variable 'omit' from source: magic vars 28983 1726883000.21882: variable 'ansible_distribution_major_version' from source: facts 28983 1726883000.21896: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883000.22150: variable 'network_provider' from source: set_fact 28983 1726883000.22155: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883000.22241: when evaluation is False, skipping this task 28983 1726883000.22244: _execute() done 28983 1726883000.22246: dumping result to json 28983 1726883000.22249: done dumping result, returning 28983 1726883000.22252: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-00000000073d] 28983 1726883000.22256: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073d 28983 1726883000.22339: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073d 28983 1726883000.22343: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883000.22398: no more pending results, returning what we have 28983 1726883000.22403: results queue empty 28983 1726883000.22404: checking for any_errors_fatal 28983 1726883000.22418: done checking for any_errors_fatal 28983 1726883000.22419: checking for max_fail_percentage 28983 1726883000.22422: done checking for max_fail_percentage 28983 1726883000.22423: checking to see if all hosts have failed and the running result is not ok 28983 1726883000.22424: done checking to see if all hosts have failed 28983 1726883000.22425: getting the remaining hosts for this loop 28983 1726883000.22427: done getting the remaining hosts for this loop 28983 1726883000.22433: getting the next task for host managed_node2 28983 1726883000.22444: done getting next task for host managed_node2 28983 1726883000.22449: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883000.22457: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883000.22485: getting variables 28983 1726883000.22487: in VariableManager get_vars() 28983 1726883000.22530: Calling all_inventory to load vars for managed_node2 28983 1726883000.22650: Calling groups_inventory to load vars for managed_node2 28983 1726883000.22654: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883000.22667: Calling all_plugins_play to load vars for managed_node2 28983 1726883000.22671: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883000.22677: Calling groups_plugins_play to load vars for managed_node2 28983 1726883000.27642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883000.33987: done with get_vars() 28983 1726883000.34028: done getting variables 28983 1726883000.34302: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:43:20 -0400 (0:00:00.153) 0:00:30.341 ****** 28983 1726883000.34350: entering _queue_task() for managed_node2/copy 28983 1726883000.35030: worker is 1 (out of 1 available) 28983 1726883000.35046: exiting _queue_task() for managed_node2/copy 28983 1726883000.35063: done queuing things up, now waiting for results queue to drain 28983 1726883000.35065: waiting for 
pending results... 28983 1726883000.35856: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883000.35863: in run() - task 0affe814-3a2d-b16d-c0a7-00000000073e 28983 1726883000.35867: variable 'ansible_search_path' from source: unknown 28983 1726883000.35870: variable 'ansible_search_path' from source: unknown 28983 1726883000.35875: calling self._execute() 28983 1726883000.35879: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883000.35883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883000.35886: variable 'omit' from source: magic vars 28983 1726883000.36340: variable 'ansible_distribution_major_version' from source: facts 28983 1726883000.36344: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883000.36475: variable 'network_provider' from source: set_fact 28983 1726883000.36479: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883000.36483: when evaluation is False, skipping this task 28983 1726883000.36486: _execute() done 28983 1726883000.36496: dumping result to json 28983 1726883000.36500: done dumping result, returning 28983 1726883000.36506: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-00000000073e] 28983 1726883000.36513: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073e skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883000.36827: no more pending results, returning what we have 28983 1726883000.36830: results queue empty 28983 1726883000.36831: checking for any_errors_fatal 28983 1726883000.36839: done checking for any_errors_fatal 28983 1726883000.36840: checking for 
max_fail_percentage 28983 1726883000.36842: done checking for max_fail_percentage 28983 1726883000.36843: checking to see if all hosts have failed and the running result is not ok 28983 1726883000.36844: done checking to see if all hosts have failed 28983 1726883000.36845: getting the remaining hosts for this loop 28983 1726883000.36846: done getting the remaining hosts for this loop 28983 1726883000.36850: getting the next task for host managed_node2 28983 1726883000.36858: done getting next task for host managed_node2 28983 1726883000.36863: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883000.36869: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883000.36890: getting variables 28983 1726883000.36891: in VariableManager get_vars() 28983 1726883000.36924: Calling all_inventory to load vars for managed_node2 28983 1726883000.36928: Calling groups_inventory to load vars for managed_node2 28983 1726883000.36930: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883000.36942: Calling all_plugins_play to load vars for managed_node2 28983 1726883000.36946: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883000.36950: Calling groups_plugins_play to load vars for managed_node2 28983 1726883000.37550: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073e 28983 1726883000.37554: WORKER PROCESS EXITING 28983 1726883000.40994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883000.47451: done with get_vars() 28983 1726883000.47611: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:43:20 -0400 (0:00:00.134) 0:00:30.476 ****** 28983 1726883000.47843: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883000.48624: worker is 1 (out of 1 available) 28983 1726883000.48639: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883000.48654: done queuing things up, now waiting for results queue to drain 28983 1726883000.48656: waiting for pending results... 
28983 1726883000.49152: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883000.49513: in run() - task 0affe814-3a2d-b16d-c0a7-00000000073f 28983 1726883000.49529: variable 'ansible_search_path' from source: unknown 28983 1726883000.49536: variable 'ansible_search_path' from source: unknown 28983 1726883000.49575: calling self._execute() 28983 1726883000.49979: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883000.49987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883000.50000: variable 'omit' from source: magic vars 28983 1726883000.50882: variable 'ansible_distribution_major_version' from source: facts 28983 1726883000.50886: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883000.50889: variable 'omit' from source: magic vars 28983 1726883000.51156: variable 'omit' from source: magic vars 28983 1726883000.51461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883000.56424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883000.56503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883000.56750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883000.56787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883000.56818: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883000.56914: variable 'network_provider' from source: set_fact 28983 1726883000.57406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883000.57412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883000.57566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883000.57622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883000.57636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883000.57739: variable 'omit' from source: magic vars 28983 1726883000.58061: variable 'omit' from source: magic vars 28983 1726883000.58392: variable 'network_connections' from source: include params 28983 1726883000.58407: variable 'interface' from source: play vars 28983 1726883000.58493: variable 'interface' from source: play vars 28983 1726883000.58878: variable 'omit' from source: magic vars 28983 1726883000.58887: variable '__lsr_ansible_managed' from source: task vars 28983 1726883000.59168: variable '__lsr_ansible_managed' from source: task vars 28983 1726883000.59732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883000.59949: Loaded config def from plugin (lookup/template) 28983 1726883000.59953: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883000.59985: File lookup term: get_ansible_managed.j2 28983 1726883000.59989: variable 
'ansible_search_path' from source: unknown 28983 1726883000.59992: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883000.60012: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883000.60029: variable 'ansible_search_path' from source: unknown 28983 1726883000.80157: variable 'ansible_managed' from source: unknown 28983 1726883000.80590: variable 'omit' from source: magic vars 28983 1726883000.81039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883000.81043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883000.81046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883000.81049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883000.81051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883000.81053: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883000.81055: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883000.81058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883000.81369: Set connection var ansible_connection to ssh 28983 1726883000.81389: Set connection var ansible_shell_executable to /bin/sh 28983 1726883000.81406: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883000.81551: Set connection var ansible_timeout to 10 28983 1726883000.81563: Set connection var ansible_pipelining to False 28983 1726883000.81571: Set connection var ansible_shell_type to sh 28983 1726883000.81604: variable 'ansible_shell_executable' from source: unknown 28983 1726883000.81613: variable 'ansible_connection' from source: unknown 28983 1726883000.81940: variable 'ansible_module_compression' from source: unknown 28983 1726883000.81943: variable 'ansible_shell_type' from source: unknown 28983 1726883000.81946: variable 'ansible_shell_executable' from source: unknown 28983 1726883000.81948: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883000.81950: variable 'ansible_pipelining' from source: unknown 28983 1726883000.81952: variable 'ansible_timeout' from source: unknown 28983 1726883000.81954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883000.81957: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883000.81967: variable 'omit' from 
source: magic vars 28983 1726883000.82151: starting attempt loop 28983 1726883000.82160: running the handler 28983 1726883000.82181: _low_level_execute_command(): starting 28983 1726883000.82194: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883000.83651: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883000.83706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883000.83727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883000.83858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883000.84058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883000.85838: stdout chunk (state=3): >>>/root <<< 28983 1726883000.86009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883000.86021: stdout chunk (state=3): >>><<< 28983 1726883000.86036: stderr chunk (state=3): >>><<< 28983 1726883000.86356: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883000.86360: _low_level_execute_command(): starting 28983 1726883000.86363: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518 `" && echo ansible-tmp-1726883000.862577-30163-191801234903518="` echo /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518 `" ) && sleep 0' 28983 1726883000.87350: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883000.87572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883000.87598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883000.87700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883000.89743: stdout chunk (state=3): >>>ansible-tmp-1726883000.862577-30163-191801234903518=/root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518 <<< 28983 1726883000.89860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883000.90159: stderr chunk (state=3): >>><<< 28983 1726883000.90163: stdout chunk (state=3): >>><<< 28983 1726883000.90339: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883000.862577-30163-191801234903518=/root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883000.90343: variable 'ansible_module_compression' from source: unknown 28983 1726883000.90345: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883000.90348: variable 'ansible_facts' from source: unknown 28983 1726883000.90655: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py 28983 1726883000.91006: Sending initial data 28983 1726883000.91009: Sent initial data (167 bytes) 28983 1726883000.92351: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883000.92384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883000.92554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883000.92654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883000.94317: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883000.94385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883000.94461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpk8uwvfg2 /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py <<< 28983 1726883000.94465: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py" <<< 28983 1726883000.94542: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpk8uwvfg2" to remote "/root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py" <<< 28983 1726883000.97985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883000.98051: stderr chunk (state=3): >>><<< 28983 1726883000.98055: stdout chunk (state=3): >>><<< 28983 1726883000.98088: done transferring module to remote 28983 1726883000.98107: _low_level_execute_command(): starting 28983 1726883000.98113: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/ /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py && sleep 0' 28983 1726883000.99436: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883000.99550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883000.99618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883000.99740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883000.99754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883000.99854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883001.01939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883001.01943: stderr chunk (state=3): >>><<< 28983 1726883001.01946: stdout chunk (state=3): >>><<< 28983 1726883001.01949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883001.01951: _low_level_execute_command(): starting 28983 1726883001.01954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/AnsiballZ_network_connections.py && sleep 0' 28983 1726883001.03123: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883001.03164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883001.03177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883001.03304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883001.03413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883001.03463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 
1726883001.03561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883001.32870: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883001.34976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883001.34984: stdout chunk (state=3): >>><<< 28983 1726883001.34994: stderr chunk (state=3): >>><<< 28983 1726883001.35018: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883001.35241: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883001.35248: _low_level_execute_command(): starting 28983 1726883001.35254: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883000.862577-30163-191801234903518/ > /dev/null 2>&1 && sleep 0' 28983 1726883001.36740: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883001.36785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883001.36800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883001.36951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883001.37023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883001.37122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883001.37233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883001.39249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883001.39260: stdout chunk (state=3): >>><<< 28983 1726883001.39282: stderr chunk (state=3): >>><<< 28983 1726883001.39305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883001.39318: handler run complete 28983 1726883001.39389: attempt loop complete, returning result 28983 1726883001.39608: _execute() done 28983 1726883001.39611: dumping result to json 28983 1726883001.39613: done dumping result, returning 28983 1726883001.39615: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-00000000073f] 28983 1726883001.39617: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073f changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397 28983 1726883001.39942: no more pending results, returning what we have 28983 1726883001.39946: results queue empty 28983 1726883001.39947: checking for any_errors_fatal 28983 1726883001.39959: done checking for any_errors_fatal 28983 1726883001.39960: checking for max_fail_percentage 28983 1726883001.39962: done checking for max_fail_percentage 28983 1726883001.39963: checking to see if all hosts have failed and the running result is not ok 28983 1726883001.39964: done checking to see if all 
hosts have failed 28983 1726883001.39965: getting the remaining hosts for this loop 28983 1726883001.39967: done getting the remaining hosts for this loop 28983 1726883001.39975: getting the next task for host managed_node2 28983 1726883001.39984: done getting next task for host managed_node2 28983 1726883001.39988: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883001.39993: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883001.40454: getting variables 28983 1726883001.40457: in VariableManager get_vars() 28983 1726883001.40500: Calling all_inventory to load vars for managed_node2 28983 1726883001.40504: Calling groups_inventory to load vars for managed_node2 28983 1726883001.40507: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883001.40518: Calling all_plugins_play to load vars for managed_node2 28983 1726883001.40522: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883001.40526: Calling groups_plugins_play to load vars for managed_node2 28983 1726883001.41148: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000073f 28983 1726883001.41152: WORKER PROCESS EXITING 28983 1726883001.45665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883001.52184: done with get_vars() 28983 1726883001.52229: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:43:21 -0400 (0:00:01.048) 0:00:31.524 ****** 28983 1726883001.52648: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883001.53494: worker is 1 (out of 1 available) 28983 1726883001.53508: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883001.53523: done queuing things up, now waiting for results queue to drain 28983 1726883001.53525: waiting for pending results... 
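The "Configure networking connection profiles" task above applied a `network_connections` payload via the `nm` provider. A minimal playbook sketch that would hand the role the same profile seen in the `module_args` of this log (the play targeting and role variable interface follow the role's documented usage; this is a reconstruction, not the playbook actually run here):

```yaml
# Hypothetical reproduction of the profile applied in the log above:
# a 'statebr' bridge with autoconnect disabled and IPv4/IPv6
# auto-configuration turned off. The provider ('nm' in this run)
# is selected by the role, not set directly.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            type: bridge
            autoconnect: false
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false
```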
28983 1726883001.54095: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883001.54586: in run() - task 0affe814-3a2d-b16d-c0a7-000000000740 28983 1726883001.54644: variable 'ansible_search_path' from source: unknown 28983 1726883001.54659: variable 'ansible_search_path' from source: unknown 28983 1726883001.54866: calling self._execute() 28983 1726883001.55018: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.55031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883001.55050: variable 'omit' from source: magic vars 28983 1726883001.56055: variable 'ansible_distribution_major_version' from source: facts 28983 1726883001.56263: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883001.56458: variable 'network_state' from source: role '' defaults 28983 1726883001.56526: Evaluated conditional (network_state != {}): False 28983 1726883001.56537: when evaluation is False, skipping this task 28983 1726883001.56619: _execute() done 28983 1726883001.56622: dumping result to json 28983 1726883001.56626: done dumping result, returning 28983 1726883001.56629: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000000740] 28983 1726883001.56643: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000740 28983 1726883001.57052: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000740 28983 1726883001.57056: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883001.57127: no more pending results, returning what we have 28983 1726883001.57132: results queue empty 28983 1726883001.57136: checking for any_errors_fatal 28983 1726883001.57152: done checking for any_errors_fatal 
28983 1726883001.57158: checking for max_fail_percentage 28983 1726883001.57161: done checking for max_fail_percentage 28983 1726883001.57162: checking to see if all hosts have failed and the running result is not ok 28983 1726883001.57163: done checking to see if all hosts have failed 28983 1726883001.57168: getting the remaining hosts for this loop 28983 1726883001.57171: done getting the remaining hosts for this loop 28983 1726883001.57178: getting the next task for host managed_node2 28983 1726883001.57189: done getting next task for host managed_node2 28983 1726883001.57193: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883001.57201: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883001.57224: getting variables 28983 1726883001.57226: in VariableManager get_vars() 28983 1726883001.57521: Calling all_inventory to load vars for managed_node2 28983 1726883001.57525: Calling groups_inventory to load vars for managed_node2 28983 1726883001.57528: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883001.57540: Calling all_plugins_play to load vars for managed_node2 28983 1726883001.57544: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883001.57549: Calling groups_plugins_play to load vars for managed_node2 28983 1726883001.62795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883001.69318: done with get_vars() 28983 1726883001.69427: done getting variables 28983 1726883001.69580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:43:21 -0400 (0:00:00.169) 0:00:31.693 ****** 28983 1726883001.69622: entering _queue_task() for managed_node2/debug 28983 1726883001.70555: worker is 1 (out of 1 available) 28983 1726883001.70568: exiting _queue_task() for managed_node2/debug 28983 1726883001.70582: done queuing things up, now waiting for results queue to drain 28983 1726883001.70584: waiting for pending results... 
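The "Configure networking state" task above is skipped because the role default `network_state` is an empty dict, so its conditional `network_state != {}` evaluates False. A hedged sketch of a variable assignment that would make the task run (the nmstate-style interface schema below is an assumption for illustration, not taken from this log):

```yaml
# Assumed example only: a non-empty network_state makes the
# conditional `network_state != {}` evaluate True, so the
# network_state module would be invoked instead of being skipped.
vars:
  network_state:
    interfaces:
      - name: statebr
        type: linux-bridge
        state: up
```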
28983 1726883001.71224: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883001.71646: in run() - task 0affe814-3a2d-b16d-c0a7-000000000741 28983 1726883001.71650: variable 'ansible_search_path' from source: unknown 28983 1726883001.71655: variable 'ansible_search_path' from source: unknown 28983 1726883001.71658: calling self._execute() 28983 1726883001.71820: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.71893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883001.72080: variable 'omit' from source: magic vars 28983 1726883001.72891: variable 'ansible_distribution_major_version' from source: facts 28983 1726883001.72958: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883001.72980: variable 'omit' from source: magic vars 28983 1726883001.73181: variable 'omit' from source: magic vars 28983 1726883001.73278: variable 'omit' from source: magic vars 28983 1726883001.73369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883001.73604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883001.73607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883001.73610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883001.73613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883001.73748: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883001.73758: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.73768: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883001.74020: Set connection var ansible_connection to ssh 28983 1726883001.74084: Set connection var ansible_shell_executable to /bin/sh 28983 1726883001.74168: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883001.74186: Set connection var ansible_timeout to 10 28983 1726883001.74200: Set connection var ansible_pipelining to False 28983 1726883001.74290: Set connection var ansible_shell_type to sh 28983 1726883001.74303: variable 'ansible_shell_executable' from source: unknown 28983 1726883001.74312: variable 'ansible_connection' from source: unknown 28983 1726883001.74387: variable 'ansible_module_compression' from source: unknown 28983 1726883001.74390: variable 'ansible_shell_type' from source: unknown 28983 1726883001.74468: variable 'ansible_shell_executable' from source: unknown 28983 1726883001.74474: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.74476: variable 'ansible_pipelining' from source: unknown 28983 1726883001.74478: variable 'ansible_timeout' from source: unknown 28983 1726883001.74481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883001.74929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883001.74933: variable 'omit' from source: magic vars 28983 1726883001.74939: starting attempt loop 28983 1726883001.74942: running the handler 28983 1726883001.75199: variable '__network_connections_result' from source: set_fact 28983 1726883001.75316: handler run complete 28983 1726883001.75392: attempt loop complete, returning result 28983 1726883001.75586: _execute() done 28983 1726883001.75589: dumping result to json 28983 1726883001.75592: 
done dumping result, returning 28983 1726883001.75595: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000000741] 28983 1726883001.75597: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000741 28983 1726883001.75669: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000741 28983 1726883001.75675: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397" ] } 28983 1726883001.75765: no more pending results, returning what we have 28983 1726883001.75769: results queue empty 28983 1726883001.75770: checking for any_errors_fatal 28983 1726883001.75781: done checking for any_errors_fatal 28983 1726883001.75782: checking for max_fail_percentage 28983 1726883001.75784: done checking for max_fail_percentage 28983 1726883001.75785: checking to see if all hosts have failed and the running result is not ok 28983 1726883001.75786: done checking to see if all hosts have failed 28983 1726883001.75787: getting the remaining hosts for this loop 28983 1726883001.75789: done getting the remaining hosts for this loop 28983 1726883001.75940: getting the next task for host managed_node2 28983 1726883001.75951: done getting next task for host managed_node2 28983 1726883001.75957: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883001.75963: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883001.75980: getting variables 28983 1726883001.75981: in VariableManager get_vars() 28983 1726883001.76347: Calling all_inventory to load vars for managed_node2 28983 1726883001.76350: Calling groups_inventory to load vars for managed_node2 28983 1726883001.76353: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883001.76362: Calling all_plugins_play to load vars for managed_node2 28983 1726883001.76366: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883001.76370: Calling groups_plugins_play to load vars for managed_node2 28983 1726883001.81299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883001.89052: done with get_vars() 28983 1726883001.89094: done getting variables 28983 1726883001.89348: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:43:21 -0400 (0:00:00.197) 0:00:31.891 ****** 28983 1726883001.89398: entering _queue_task() for managed_node2/debug 28983 1726883001.90512: worker is 1 (out of 1 available) 28983 1726883001.90527: exiting _queue_task() for managed_node2/debug 28983 1726883001.90542: done queuing things up, now waiting for results queue to drain 28983 1726883001.90620: waiting for pending results... 28983 1726883001.91298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883001.91510: in run() - task 0affe814-3a2d-b16d-c0a7-000000000742 28983 1726883001.91537: variable 'ansible_search_path' from source: unknown 28983 1726883001.91719: variable 'ansible_search_path' from source: unknown 28983 1726883001.91723: calling self._execute() 28983 1726883001.91888: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.91959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883001.91981: variable 'omit' from source: magic vars 28983 1726883001.92985: variable 'ansible_distribution_major_version' from source: facts 28983 1726883001.93010: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883001.93024: variable 'omit' from source: magic vars 28983 1726883001.93243: variable 'omit' from source: magic vars 28983 1726883001.93396: variable 'omit' from source: magic vars 28983 1726883001.93501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883001.93556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883001.93585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883001.93610: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883001.93626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883001.93678: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883001.93690: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.93699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883001.93832: Set connection var ansible_connection to ssh 28983 1726883001.93862: Set connection var ansible_shell_executable to /bin/sh 28983 1726883001.93881: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883001.93898: Set connection var ansible_timeout to 10 28983 1726883001.93909: Set connection var ansible_pipelining to False 28983 1726883001.93917: Set connection var ansible_shell_type to sh 28983 1726883001.93949: variable 'ansible_shell_executable' from source: unknown 28983 1726883001.93969: variable 'ansible_connection' from source: unknown 28983 1726883001.93975: variable 'ansible_module_compression' from source: unknown 28983 1726883001.94040: variable 'ansible_shell_type' from source: unknown 28983 1726883001.94043: variable 'ansible_shell_executable' from source: unknown 28983 1726883001.94045: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883001.94047: variable 'ansible_pipelining' from source: unknown 28983 1726883001.94049: variable 'ansible_timeout' from source: unknown 28983 1726883001.94052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883001.94204: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883001.94225: variable 'omit' from source: magic vars 28983 1726883001.94239: starting attempt loop 28983 1726883001.94247: running the handler 28983 1726883001.94315: variable '__network_connections_result' from source: set_fact 28983 1726883001.94421: variable '__network_connections_result' from source: set_fact 28983 1726883001.94596: handler run complete 28983 1726883001.94720: attempt loop complete, returning result 28983 1726883001.94723: _execute() done 28983 1726883001.94730: dumping result to json 28983 1726883001.94733: done dumping result, returning 28983 1726883001.94737: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000000742] 28983 1726883001.94739: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000742 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397" ] } } 28983 1726883001.94942: no more pending results, returning what we have 28983 1726883001.94946: results queue empty 28983 1726883001.94947: checking for any_errors_fatal 28983 1726883001.94957: done checking for any_errors_fatal 
28983 1726883001.94958: checking for max_fail_percentage 28983 1726883001.94960: done checking for max_fail_percentage 28983 1726883001.94961: checking to see if all hosts have failed and the running result is not ok 28983 1726883001.94962: done checking to see if all hosts have failed 28983 1726883001.94963: getting the remaining hosts for this loop 28983 1726883001.94966: done getting the remaining hosts for this loop 28983 1726883001.94970: getting the next task for host managed_node2 28983 1726883001.94982: done getting next task for host managed_node2 28983 1726883001.94987: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883001.94993: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883001.95006: getting variables 28983 1726883001.95008: in VariableManager get_vars() 28983 1726883001.95263: Calling all_inventory to load vars for managed_node2 28983 1726883001.95267: Calling groups_inventory to load vars for managed_node2 28983 1726883001.95278: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883001.95289: Calling all_plugins_play to load vars for managed_node2 28983 1726883001.95293: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883001.95297: Calling groups_plugins_play to load vars for managed_node2 28983 1726883001.95879: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000742 28983 1726883001.95882: WORKER PROCESS EXITING 28983 1726883001.97696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.01007: done with get_vars() 28983 1726883002.01043: done getting variables 28983 1726883002.01116: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:43:22 -0400 (0:00:00.117) 0:00:32.009 ****** 28983 1726883002.01158: entering _queue_task() for managed_node2/debug 28983 1726883002.01574: worker is 1 (out of 1 available) 28983 1726883002.01587: exiting _queue_task() for managed_node2/debug 28983 1726883002.01600: done queuing things up, now waiting for results queue to drain 28983 1726883002.01601: waiting for pending results... 
28983 1726883002.01859: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883002.02099: in run() - task 0affe814-3a2d-b16d-c0a7-000000000743 28983 1726883002.02121: variable 'ansible_search_path' from source: unknown 28983 1726883002.02180: variable 'ansible_search_path' from source: unknown 28983 1726883002.02236: calling self._execute() 28983 1726883002.02350: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.02364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.02385: variable 'omit' from source: magic vars 28983 1726883002.03039: variable 'ansible_distribution_major_version' from source: facts 28983 1726883002.03042: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883002.03045: variable 'network_state' from source: role '' defaults 28983 1726883002.03047: Evaluated conditional (network_state != {}): False 28983 1726883002.03049: when evaluation is False, skipping this task 28983 1726883002.03051: _execute() done 28983 1726883002.03053: dumping result to json 28983 1726883002.03055: done dumping result, returning 28983 1726883002.03058: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000000743] 28983 1726883002.03068: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000743 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883002.03242: no more pending results, returning what we have 28983 1726883002.03248: results queue empty 28983 1726883002.03249: checking for any_errors_fatal 28983 1726883002.03260: done checking for any_errors_fatal 28983 1726883002.03261: checking for max_fail_percentage 28983 1726883002.03264: done checking for max_fail_percentage 28983 1726883002.03265: checking to see if all hosts have 
failed and the running result is not ok 28983 1726883002.03266: done checking to see if all hosts have failed 28983 1726883002.03267: getting the remaining hosts for this loop 28983 1726883002.03269: done getting the remaining hosts for this loop 28983 1726883002.03276: getting the next task for host managed_node2 28983 1726883002.03287: done getting next task for host managed_node2 28983 1726883002.03293: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883002.03301: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883002.03323: getting variables 28983 1726883002.03325: in VariableManager get_vars() 28983 1726883002.03376: Calling all_inventory to load vars for managed_node2 28983 1726883002.03380: Calling groups_inventory to load vars for managed_node2 28983 1726883002.03383: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.03395: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.03399: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.03403: Calling groups_plugins_play to load vars for managed_node2 28983 1726883002.04250: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000743 28983 1726883002.04254: WORKER PROCESS EXITING 28983 1726883002.05983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.09692: done with get_vars() 28983 1726883002.09729: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:43:22 -0400 (0:00:00.086) 0:00:32.096 ****** 28983 1726883002.09852: entering _queue_task() for managed_node2/ping 28983 1726883002.10175: worker is 1 (out of 1 available) 28983 1726883002.10190: exiting _queue_task() for managed_node2/ping 28983 1726883002.10204: done queuing things up, now waiting for results queue to drain 28983 1726883002.10206: waiting for pending results... 
28983 1726883002.10611: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883002.10901: in run() - task 0affe814-3a2d-b16d-c0a7-000000000744 28983 1726883002.10978: variable 'ansible_search_path' from source: unknown 28983 1726883002.10988: variable 'ansible_search_path' from source: unknown 28983 1726883002.11091: calling self._execute() 28983 1726883002.11409: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.11427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.11449: variable 'omit' from source: magic vars 28983 1726883002.12277: variable 'ansible_distribution_major_version' from source: facts 28983 1726883002.12301: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883002.12312: variable 'omit' from source: magic vars 28983 1726883002.12403: variable 'omit' from source: magic vars 28983 1726883002.12449: variable 'omit' from source: magic vars 28983 1726883002.12502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883002.12555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883002.12584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883002.12610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883002.12632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883002.12678: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883002.12688: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.12728: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883002.12826: Set connection var ansible_connection to ssh 28983 1726883002.12852: Set connection var ansible_shell_executable to /bin/sh 28983 1726883002.12869: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883002.12886: Set connection var ansible_timeout to 10 28983 1726883002.12899: Set connection var ansible_pipelining to False 28983 1726883002.12941: Set connection var ansible_shell_type to sh 28983 1726883002.12944: variable 'ansible_shell_executable' from source: unknown 28983 1726883002.12949: variable 'ansible_connection' from source: unknown 28983 1726883002.12958: variable 'ansible_module_compression' from source: unknown 28983 1726883002.12966: variable 'ansible_shell_type' from source: unknown 28983 1726883002.12974: variable 'ansible_shell_executable' from source: unknown 28983 1726883002.12982: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.12991: variable 'ansible_pipelining' from source: unknown 28983 1726883002.13040: variable 'ansible_timeout' from source: unknown 28983 1726883002.13043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.13249: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883002.13274: variable 'omit' from source: magic vars 28983 1726883002.13288: starting attempt loop 28983 1726883002.13296: running the handler 28983 1726883002.13316: _low_level_execute_command(): starting 28983 1726883002.13331: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883002.14079: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883002.14098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 
1726883002.14114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883002.14142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883002.14254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883002.14288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883002.14393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883002.16388: stdout chunk (state=3): >>>/root <<< 28983 1726883002.16443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883002.16510: stderr chunk (state=3): >>><<< 28983 1726883002.16513: stdout chunk (state=3): >>><<< 28983 1726883002.16729: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883002.16749: _low_level_execute_command(): starting 28983 1726883002.16754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581 `" && echo ansible-tmp-1726883002.1658418-30203-271923302224581="` echo /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581 `" ) && sleep 0' 28983 1726883002.17792: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883002.17802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883002.17828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883002.17839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883002.17862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883002.17866: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883002.17879: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883002.17917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883002.17921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883002.17923: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883002.17926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883002.17965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883002.17993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883002.18017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883002.18027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883002.18103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883002.20150: stdout chunk (state=3): >>>ansible-tmp-1726883002.1658418-30203-271923302224581=/root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581 <<< 28983 1726883002.20294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883002.20343: stderr chunk (state=3): >>><<< 28983 1726883002.20346: stdout chunk (state=3): >>><<< 28983 1726883002.20386: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883002.1658418-30203-271923302224581=/root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883002.20414: variable 'ansible_module_compression' from source: unknown 28983 1726883002.20451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883002.20479: variable 'ansible_facts' from source: unknown 28983 1726883002.20542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py 28983 1726883002.20713: Sending initial data 28983 1726883002.20716: Sent initial data (153 bytes) 28983 1726883002.21354: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883002.21450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883002.21465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883002.21517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883002.21626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883002.23252: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 28983 1726883002.23260: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883002.23321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883002.23387: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpr4aig781 /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py <<< 28983 1726883002.23394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py" <<< 28983 1726883002.23457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpr4aig781" to remote "/root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py" <<< 28983 1726883002.24325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883002.24381: stderr chunk (state=3): >>><<< 28983 1726883002.24384: stdout chunk (state=3): >>><<< 28983 1726883002.24403: done transferring module to remote 28983 1726883002.24415: _low_level_execute_command(): starting 28983 1726883002.24419: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/ /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py && sleep 0' 28983 1726883002.24979: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883002.25044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883002.27028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883002.27032: stdout chunk (state=3): >>><<< 28983 1726883002.27036: stderr chunk (state=3): >>><<< 28983 1726883002.27056: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 28983 1726883002.27145: _low_level_execute_command(): starting 28983 1726883002.27154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/AnsiballZ_ping.py && sleep 0' 28983 1726883002.27756: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883002.27862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883002.27899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883002.27914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883002.27943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883002.28083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883002.45490: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883002.46927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883002.46992: stderr chunk (state=3): >>><<< 28983 1726883002.46996: stdout chunk (state=3): >>><<< 28983 1726883002.47024: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883002.47053: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883002.47064: _low_level_execute_command(): starting 28983 1726883002.47070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883002.1658418-30203-271923302224581/ > /dev/null 2>&1 && sleep 0' 28983 1726883002.47543: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883002.47549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883002.47575: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883002.47658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883002.47662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883002.47740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883002.49691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883002.49736: stderr chunk (state=3): >>><<< 28983 1726883002.49739: stdout chunk (state=3): >>><<< 28983 1726883002.49755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883002.49764: handler run complete 28983 1726883002.49792: attempt loop complete, returning result 28983 1726883002.49809: _execute() done 28983 
1726883002.49812: dumping result to json 28983 1726883002.49815: done dumping result, returning 28983 1726883002.49817: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000000744] 28983 1726883002.49826: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000744 28983 1726883002.49923: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000744 28983 1726883002.49926: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883002.50000: no more pending results, returning what we have 28983 1726883002.50004: results queue empty 28983 1726883002.50005: checking for any_errors_fatal 28983 1726883002.50011: done checking for any_errors_fatal 28983 1726883002.50012: checking for max_fail_percentage 28983 1726883002.50014: done checking for max_fail_percentage 28983 1726883002.50015: checking to see if all hosts have failed and the running result is not ok 28983 1726883002.50016: done checking to see if all hosts have failed 28983 1726883002.50017: getting the remaining hosts for this loop 28983 1726883002.50019: done getting the remaining hosts for this loop 28983 1726883002.50024: getting the next task for host managed_node2 28983 1726883002.50037: done getting next task for host managed_node2 28983 1726883002.50041: ^ task is: TASK: meta (role_complete) 28983 1726883002.50046: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883002.50058: getting variables 28983 1726883002.50060: in VariableManager get_vars() 28983 1726883002.50103: Calling all_inventory to load vars for managed_node2 28983 1726883002.50106: Calling groups_inventory to load vars for managed_node2 28983 1726883002.50108: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.50118: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.50121: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.50125: Calling groups_plugins_play to load vars for managed_node2 28983 1726883002.52648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.54405: done with get_vars() 28983 1726883002.54430: done getting variables 28983 1726883002.54504: done queuing things up, now waiting for results queue to drain 28983 1726883002.54506: results queue empty 28983 1726883002.54507: checking for any_errors_fatal 28983 1726883002.54509: done checking for any_errors_fatal 28983 1726883002.54510: checking for max_fail_percentage 28983 1726883002.54510: done checking for max_fail_percentage 28983 1726883002.54511: checking to see if all hosts have failed and the running result is not ok 28983 1726883002.54512: done checking to see if all hosts have failed 28983 1726883002.54512: getting the remaining hosts for this 
loop 28983 1726883002.54513: done getting the remaining hosts for this loop 28983 1726883002.54516: getting the next task for host managed_node2 28983 1726883002.54520: done getting next task for host managed_node2 28983 1726883002.54522: ^ task is: TASK: Show result 28983 1726883002.54523: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883002.54526: getting variables 28983 1726883002.54527: in VariableManager get_vars() 28983 1726883002.54537: Calling all_inventory to load vars for managed_node2 28983 1726883002.54539: Calling groups_inventory to load vars for managed_node2 28983 1726883002.54541: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.54545: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.54548: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.54550: Calling groups_plugins_play to load vars for managed_node2 28983 1726883002.56031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.58407: done with get_vars() 28983 1726883002.58428: done getting variables 28983 1726883002.58467: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Friday 20 September 2024 21:43:22 -0400 (0:00:00.486) 0:00:32.582 ****** 28983 1726883002.58495: entering _queue_task() for managed_node2/debug 28983 1726883002.58764: worker is 1 (out of 1 available) 28983 1726883002.58782: exiting _queue_task() for managed_node2/debug 28983 1726883002.58796: done queuing things up, now waiting for results queue to drain 28983 1726883002.58798: waiting for pending results... 
28983 1726883002.59006: running TaskExecutor() for managed_node2/TASK: Show result 28983 1726883002.59104: in run() - task 0affe814-3a2d-b16d-c0a7-0000000006b2 28983 1726883002.59118: variable 'ansible_search_path' from source: unknown 28983 1726883002.59121: variable 'ansible_search_path' from source: unknown 28983 1726883002.59158: calling self._execute() 28983 1726883002.59245: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.59252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.59262: variable 'omit' from source: magic vars 28983 1726883002.59713: variable 'ansible_distribution_major_version' from source: facts 28983 1726883002.59716: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883002.59719: variable 'omit' from source: magic vars 28983 1726883002.59721: variable 'omit' from source: magic vars 28983 1726883002.59973: variable 'omit' from source: magic vars 28983 1726883002.59977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883002.59980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883002.59983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883002.59985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883002.59987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883002.59990: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883002.59992: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.59994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.60043: Set 
connection var ansible_connection to ssh 28983 1726883002.60057: Set connection var ansible_shell_executable to /bin/sh 28983 1726883002.60068: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883002.60084: Set connection var ansible_timeout to 10 28983 1726883002.60092: Set connection var ansible_pipelining to False 28983 1726883002.60094: Set connection var ansible_shell_type to sh 28983 1726883002.60122: variable 'ansible_shell_executable' from source: unknown 28983 1726883002.60126: variable 'ansible_connection' from source: unknown 28983 1726883002.60129: variable 'ansible_module_compression' from source: unknown 28983 1726883002.60229: variable 'ansible_shell_type' from source: unknown 28983 1726883002.60232: variable 'ansible_shell_executable' from source: unknown 28983 1726883002.60237: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.60239: variable 'ansible_pipelining' from source: unknown 28983 1726883002.60241: variable 'ansible_timeout' from source: unknown 28983 1726883002.60243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.60336: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883002.60351: variable 'omit' from source: magic vars 28983 1726883002.60355: starting attempt loop 28983 1726883002.60357: running the handler 28983 1726883002.60564: variable '__network_connections_result' from source: set_fact 28983 1726883002.60567: variable '__network_connections_result' from source: set_fact 28983 1726883002.60653: handler run complete 28983 1726883002.60683: attempt loop complete, returning result 28983 1726883002.60687: _execute() done 28983 1726883002.60690: dumping result to json 28983 
1726883002.60697: done dumping result, returning 28983 1726883002.60706: done running TaskExecutor() for managed_node2/TASK: Show result [0affe814-3a2d-b16d-c0a7-0000000006b2] 28983 1726883002.60711: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000006b2 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 1a065f38-1816-45bb-8e19-5c41e45c0397" ] } } 28983 1726883002.61046: no more pending results, returning what we have 28983 1726883002.61051: results queue empty 28983 1726883002.61052: checking for any_errors_fatal 28983 1726883002.61054: done checking for any_errors_fatal 28983 1726883002.61055: checking for max_fail_percentage 28983 1726883002.61057: done checking for max_fail_percentage 28983 1726883002.61058: checking to see if all hosts have failed and the running result is not ok 28983 1726883002.61059: done checking to see if all hosts have failed 28983 1726883002.61060: getting the remaining hosts for this loop 28983 1726883002.61062: done getting the remaining hosts for this loop 28983 1726883002.61067: getting the next task for host managed_node2 28983 1726883002.61076: done getting next task for host managed_node2 28983 1726883002.61081: ^ task is: TASK: Asserts 28983 1726883002.61084: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883002.61089: getting variables 28983 1726883002.61090: in VariableManager get_vars() 28983 1726883002.61252: Calling all_inventory to load vars for managed_node2 28983 1726883002.61256: Calling groups_inventory to load vars for managed_node2 28983 1726883002.61261: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.61274: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.61278: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.61283: Calling groups_plugins_play to load vars for managed_node2 28983 1726883002.62056: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000006b2 28983 1726883002.62060: WORKER PROCESS EXITING 28983 1726883002.68258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.70999: done with get_vars() 28983 1726883002.71033: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:43:22 -0400 (0:00:00.126) 0:00:32.709 ****** 28983 1726883002.71130: entering _queue_task() for managed_node2/include_tasks 28983 1726883002.71492: worker is 1 (out of 1 available) 28983 1726883002.71507: exiting _queue_task() for managed_node2/include_tasks 28983 1726883002.71519: done queuing things up, now waiting for results queue to drain 28983 1726883002.71521: waiting for 
pending results... 28983 1726883002.71799: running TaskExecutor() for managed_node2/TASK: Asserts 28983 1726883002.72000: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005b9 28983 1726883002.72005: variable 'ansible_search_path' from source: unknown 28983 1726883002.72008: variable 'ansible_search_path' from source: unknown 28983 1726883002.72107: variable 'lsr_assert' from source: include params 28983 1726883002.72297: variable 'lsr_assert' from source: include params 28983 1726883002.72386: variable 'omit' from source: magic vars 28983 1726883002.72553: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.72673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.72678: variable 'omit' from source: magic vars 28983 1726883002.72928: variable 'ansible_distribution_major_version' from source: facts 28983 1726883002.72947: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883002.72959: variable 'item' from source: unknown 28983 1726883002.73053: variable 'item' from source: unknown 28983 1726883002.73096: variable 'item' from source: unknown 28983 1726883002.73185: variable 'item' from source: unknown 28983 1726883002.73547: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.73551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.73554: variable 'omit' from source: magic vars 28983 1726883002.73670: variable 'ansible_distribution_major_version' from source: facts 28983 1726883002.73684: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883002.73697: variable 'item' from source: unknown 28983 1726883002.73785: variable 'item' from source: unknown 28983 1726883002.73828: variable 'item' from source: unknown 28983 1726883002.73917: variable 'item' from source: unknown 28983 1726883002.74127: dumping result to json 28983 1726883002.74130: done dumping 
result, returning 28983 1726883002.74132: done running TaskExecutor() for managed_node2/TASK: Asserts [0affe814-3a2d-b16d-c0a7-0000000005b9] 28983 1726883002.74136: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b9 28983 1726883002.74180: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005b9 28983 1726883002.74183: WORKER PROCESS EXITING 28983 1726883002.74262: no more pending results, returning what we have 28983 1726883002.74268: in VariableManager get_vars() 28983 1726883002.74308: Calling all_inventory to load vars for managed_node2 28983 1726883002.74312: Calling groups_inventory to load vars for managed_node2 28983 1726883002.74316: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.74328: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.74332: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.74337: Calling groups_plugins_play to load vars for managed_node2 28983 1726883002.76722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.80240: done with get_vars() 28983 1726883002.80278: variable 'ansible_search_path' from source: unknown 28983 1726883002.80279: variable 'ansible_search_path' from source: unknown 28983 1726883002.80331: variable 'ansible_search_path' from source: unknown 28983 1726883002.80333: variable 'ansible_search_path' from source: unknown 28983 1726883002.80370: we have included files to process 28983 1726883002.80376: generating all_blocks data 28983 1726883002.80382: done generating all_blocks data 28983 1726883002.80390: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883002.80392: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 
1726883002.80399: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883002.80543: in VariableManager get_vars() 28983 1726883002.80567: done with get_vars() 28983 1726883002.80735: done processing included file 28983 1726883002.80738: iterating over new_blocks loaded from include file 28983 1726883002.80739: in VariableManager get_vars() 28983 1726883002.80756: done with get_vars() 28983 1726883002.80758: filtering new block on tags 28983 1726883002.80800: done filtering new block on tags 28983 1726883002.80803: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 28983 1726883002.80814: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726883002.80816: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726883002.80819: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726883002.80966: in VariableManager get_vars() 28983 1726883002.80988: done with get_vars() 28983 1726883002.81304: done processing included file 28983 1726883002.81306: iterating over new_blocks loaded from include file 28983 1726883002.81308: in VariableManager get_vars() 28983 1726883002.81324: done with get_vars() 28983 1726883002.81326: filtering new block on tags 28983 1726883002.81406: done filtering new block on tags 28983 1726883002.81409: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml) 28983 1726883002.81414: extending task lists for all hosts with included blocks 28983 1726883002.83373: done extending task lists 28983 1726883002.83374: done processing included files 28983 1726883002.83375: results queue empty 28983 1726883002.83376: checking for any_errors_fatal 28983 1726883002.83384: done checking for any_errors_fatal 28983 1726883002.83385: checking for max_fail_percentage 28983 1726883002.83386: done checking for max_fail_percentage 28983 1726883002.83387: checking to see if all hosts have failed and the running result is not ok 28983 1726883002.83388: done checking to see if all hosts have failed 28983 1726883002.83389: getting the remaining hosts for this loop 28983 1726883002.83391: done getting the remaining hosts for this loop 28983 1726883002.83394: getting the next task for host managed_node2 28983 1726883002.83398: done getting next task for host managed_node2 28983 1726883002.83401: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726883002.83405: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726883002.83413: getting variables 28983 1726883002.83414: in VariableManager get_vars() 28983 1726883002.83424: Calling all_inventory to load vars for managed_node2 28983 1726883002.83440: Calling groups_inventory to load vars for managed_node2 28983 1726883002.83444: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.83451: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.83455: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.83459: Calling groups_plugins_play to load vars for managed_node2 28983 1726883002.86198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.89409: done with get_vars() 28983 1726883002.89452: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:43:22 -0400 (0:00:00.184) 0:00:32.893 ****** 28983 1726883002.89551: entering _queue_task() for managed_node2/include_tasks 28983 1726883002.89987: worker is 1 (out of 1 available) 28983 1726883002.90000: exiting _queue_task() for managed_node2/include_tasks 28983 1726883002.90014: done queuing things up, now waiting for results queue to drain 28983 1726883002.90016: waiting for pending results... 
28983 1726883002.90331: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726883002.90447: in run() - task 0affe814-3a2d-b16d-c0a7-0000000008a8 28983 1726883002.90459: variable 'ansible_search_path' from source: unknown 28983 1726883002.90462: variable 'ansible_search_path' from source: unknown 28983 1726883002.90506: calling self._execute() 28983 1726883002.90622: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883002.90630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883002.90645: variable 'omit' from source: magic vars 28983 1726883002.91126: variable 'ansible_distribution_major_version' from source: facts 28983 1726883002.91140: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883002.91147: _execute() done 28983 1726883002.91150: dumping result to json 28983 1726883002.91191: done dumping result, returning 28983 1726883002.91196: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-0000000008a8] 28983 1726883002.91199: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008a8 28983 1726883002.91284: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008a8 28983 1726883002.91287: WORKER PROCESS EXITING 28983 1726883002.91341: no more pending results, returning what we have 28983 1726883002.91348: in VariableManager get_vars() 28983 1726883002.91391: Calling all_inventory to load vars for managed_node2 28983 1726883002.91395: Calling groups_inventory to load vars for managed_node2 28983 1726883002.91399: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883002.91414: Calling all_plugins_play to load vars for managed_node2 28983 1726883002.91419: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883002.91423: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883002.94179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883002.99979: done with get_vars() 28983 1726883003.00016: variable 'ansible_search_path' from source: unknown 28983 1726883003.00018: variable 'ansible_search_path' from source: unknown 28983 1726883003.00028: variable 'item' from source: include params 28983 1726883003.00369: variable 'item' from source: include params 28983 1726883003.00415: we have included files to process 28983 1726883003.00416: generating all_blocks data 28983 1726883003.00419: done generating all_blocks data 28983 1726883003.00420: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883003.00422: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883003.00425: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883003.01170: done processing included file 28983 1726883003.01175: iterating over new_blocks loaded from include file 28983 1726883003.01177: in VariableManager get_vars() 28983 1726883003.01199: done with get_vars() 28983 1726883003.01202: filtering new block on tags 28983 1726883003.01238: done filtering new block on tags 28983 1726883003.01241: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726883003.01247: extending task lists for all hosts with included blocks 28983 1726883003.01692: done extending task lists 28983 1726883003.01694: done processing included files 28983 1726883003.01694: results queue empty 28983 1726883003.01695: checking for any_errors_fatal 28983 1726883003.01700: done 
checking for any_errors_fatal 28983 1726883003.01701: checking for max_fail_percentage 28983 1726883003.01702: done checking for max_fail_percentage 28983 1726883003.01703: checking to see if all hosts have failed and the running result is not ok 28983 1726883003.01704: done checking to see if all hosts have failed 28983 1726883003.01705: getting the remaining hosts for this loop 28983 1726883003.01707: done getting the remaining hosts for this loop 28983 1726883003.01710: getting the next task for host managed_node2 28983 1726883003.01715: done getting next task for host managed_node2 28983 1726883003.01717: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726883003.01721: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883003.01723: getting variables 28983 1726883003.01724: in VariableManager get_vars() 28983 1726883003.01939: Calling all_inventory to load vars for managed_node2 28983 1726883003.01943: Calling groups_inventory to load vars for managed_node2 28983 1726883003.01946: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883003.01953: Calling all_plugins_play to load vars for managed_node2 28983 1726883003.01956: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883003.01960: Calling groups_plugins_play to load vars for managed_node2 28983 1726883003.06346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883003.09796: done with get_vars() 28983 1726883003.09830: done getting variables 28983 1726883003.10288: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:43:23 -0400 (0:00:00.207) 0:00:33.101 ****** 28983 1726883003.10324: entering _queue_task() for managed_node2/stat 28983 1726883003.11102: worker is 1 (out of 1 available) 28983 1726883003.11115: exiting _queue_task() for managed_node2/stat 28983 1726883003.11128: done queuing things up, now waiting for results queue to drain 28983 1726883003.11129: waiting for pending results... 
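
The task banner above marks the `stat` invocation from `get_interface_stat.yml:3`. Based on the module arguments dumped later in this trace (`path: /sys/class/net/statebr`, with `get_attributes`, `get_checksum`, and `get_mime` all false) and the fact that the trace later resolves `interface_stat` from `set_fact`, the included task likely resembles the sketch below. This is a hedged reconstruction, not the verbatim file contents; the intermediate register name is an assumption.

```yaml
# Hedged reconstruction of the task executed here -- not the verbatim file.
# The result is visible later in the trace as {"changed": false, "stat": {"exists": false}}.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"  # resolves to /sys/class/net/statebr in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: _stat_result  # hypothetical name; the trace shows interface_stat set via set_fact

- name: Set interface_stat fact
  set_fact:
    interface_stat: "{{ _stat_result }}"
```

Checking `/sys/class/net/<name>` is a cheap way to test for a kernel network device without shelling out to `ip link`; a missing path means the interface does not exist on the managed host.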
28983 1726883003.11954: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726883003.12196: in run() - task 0affe814-3a2d-b16d-c0a7-000000000928 28983 1726883003.12200: variable 'ansible_search_path' from source: unknown 28983 1726883003.12202: variable 'ansible_search_path' from source: unknown 28983 1726883003.12206: calling self._execute() 28983 1726883003.12370: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.12422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.12442: variable 'omit' from source: magic vars 28983 1726883003.13318: variable 'ansible_distribution_major_version' from source: facts 28983 1726883003.13413: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883003.13440: variable 'omit' from source: magic vars 28983 1726883003.13554: variable 'omit' from source: magic vars 28983 1726883003.14039: variable 'interface' from source: play vars 28983 1726883003.14043: variable 'omit' from source: magic vars 28983 1726883003.14047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883003.14068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883003.14100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883003.14168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883003.14187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883003.14341: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883003.14344: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.14347: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.14539: Set connection var ansible_connection to ssh 28983 1726883003.14612: Set connection var ansible_shell_executable to /bin/sh 28983 1726883003.14630: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883003.14814: Set connection var ansible_timeout to 10 28983 1726883003.14818: Set connection var ansible_pipelining to False 28983 1726883003.14820: Set connection var ansible_shell_type to sh 28983 1726883003.14822: variable 'ansible_shell_executable' from source: unknown 28983 1726883003.14830: variable 'ansible_connection' from source: unknown 28983 1726883003.14835: variable 'ansible_module_compression' from source: unknown 28983 1726883003.14837: variable 'ansible_shell_type' from source: unknown 28983 1726883003.14840: variable 'ansible_shell_executable' from source: unknown 28983 1726883003.14918: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.14924: variable 'ansible_pipelining' from source: unknown 28983 1726883003.14927: variable 'ansible_timeout' from source: unknown 28983 1726883003.14929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.15340: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883003.15377: variable 'omit' from source: magic vars 28983 1726883003.15541: starting attempt loop 28983 1726883003.15544: running the handler 28983 1726883003.15546: _low_level_execute_command(): starting 28983 1726883003.15549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883003.16961: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883003.16976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883003.17255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.17338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.17390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883003.17444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883003.17789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883003.19570: stdout chunk (state=3): >>>/root <<< 28983 1726883003.19709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883003.19833: stderr chunk (state=3): >>><<< 28983 1726883003.19847: stdout chunk (state=3): >>><<< 28983 1726883003.20058: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883003.20062: _low_level_execute_command(): starting 28983 1726883003.20066: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798 `" && echo ansible-tmp-1726883003.1996558-30235-176206516156798="` echo /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798 `" ) && sleep 0' 28983 1726883003.21300: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883003.21412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28983 1726883003.21490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.21652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883003.21752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883003.23801: stdout chunk (state=3): >>>ansible-tmp-1726883003.1996558-30235-176206516156798=/root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798 <<< 28983 1726883003.23969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883003.23980: stdout chunk (state=3): >>><<< 28983 1726883003.23996: stderr chunk (state=3): >>><<< 28983 1726883003.24344: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883003.1996558-30235-176206516156798=/root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883003.24348: variable 'ansible_module_compression' from source: unknown 28983 1726883003.24350: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883003.24352: variable 'ansible_facts' from source: unknown 28983 1726883003.24507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py 28983 1726883003.25040: Sending initial data 28983 1726883003.25043: Sent initial data (153 bytes) 28983 1726883003.26750: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883003.26816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.27082: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883003.27103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883003.27199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883003.28912: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726883003.29052: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883003.29120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883003.29194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp3hy19q7p /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py <<< 28983 1726883003.29205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py" <<< 28983 1726883003.29265: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp3hy19q7p" to remote "/root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py" <<< 28983 1726883003.32238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883003.32453: stderr chunk (state=3): >>><<< 28983 1726883003.32489: stdout chunk (state=3): >>><<< 28983 1726883003.32523: done transferring module to remote 28983 1726883003.32580: _low_level_execute_command(): starting 28983 1726883003.32618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/ /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py && sleep 0' 28983 1726883003.33943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883003.33967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883003.33981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883003.34001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883003.34174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.34299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883003.34316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883003.34565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883003.36741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883003.36744: stdout chunk (state=3): >>><<< 28983 1726883003.36747: stderr chunk (state=3): >>><<< 28983 1726883003.36810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883003.36985: _low_level_execute_command(): starting 28983 1726883003.36989: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/AnsiballZ_stat.py && sleep 0' 28983 1726883003.38200: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883003.38204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883003.38207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883003.38210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883003.38213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883003.38215: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883003.38218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.38220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883003.38222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883003.38225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883003.38310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883003.38314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883003.38317: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883003.38319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883003.38322: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883003.38343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.38607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883003.38687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883003.56293: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883003.57841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883003.57852: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883003.57858: stdout chunk (state=3): >>><<< 28983 1726883003.57868: stderr chunk (state=3): >>><<< 28983 1726883003.57892: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883003.57931: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883003.57945: _low_level_execute_command(): starting 28983 1726883003.57951: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883003.1996558-30235-176206516156798/ > /dev/null 2>&1 && sleep 0' 28983 1726883003.59086: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883003.59284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883003.59347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883003.59393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883003.59642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883003.61727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883003.61732: stdout chunk (state=3): >>><<< 28983 1726883003.61742: stderr chunk (state=3): >>><<< 28983 1726883003.61760: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883003.61767: handler run complete 28983 1726883003.61798: attempt loop complete, returning result 28983 
1726883003.61802: _execute() done 28983 1726883003.61805: dumping result to json 28983 1726883003.61810: done dumping result, returning 28983 1726883003.61822: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-000000000928] 28983 1726883003.61828: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000928 28983 1726883003.61947: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000928 28983 1726883003.61951: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883003.62031: no more pending results, returning what we have 28983 1726883003.62040: results queue empty 28983 1726883003.62041: checking for any_errors_fatal 28983 1726883003.62044: done checking for any_errors_fatal 28983 1726883003.62045: checking for max_fail_percentage 28983 1726883003.62047: done checking for max_fail_percentage 28983 1726883003.62048: checking to see if all hosts have failed and the running result is not ok 28983 1726883003.62049: done checking to see if all hosts have failed 28983 1726883003.62050: getting the remaining hosts for this loop 28983 1726883003.62053: done getting the remaining hosts for this loop 28983 1726883003.62065: getting the next task for host managed_node2 28983 1726883003.62077: done getting next task for host managed_node2 28983 1726883003.62081: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28983 1726883003.62087: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883003.62094: getting variables 28983 1726883003.62096: in VariableManager get_vars() 28983 1726883003.62317: Calling all_inventory to load vars for managed_node2 28983 1726883003.62321: Calling groups_inventory to load vars for managed_node2 28983 1726883003.62326: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883003.62340: Calling all_plugins_play to load vars for managed_node2 28983 1726883003.62345: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883003.62349: Calling groups_plugins_play to load vars for managed_node2 28983 1726883003.68119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883003.74995: done with get_vars() 28983 1726883003.75041: done getting variables 28983 1726883003.75207: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883003.75357: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:43:23 -0400 (0:00:00.650) 0:00:33.751 ****** 28983 
1726883003.75420: entering _queue_task() for managed_node2/assert 28983 1726883003.76485: worker is 1 (out of 1 available) 28983 1726883003.76499: exiting _queue_task() for managed_node2/assert 28983 1726883003.76515: done queuing things up, now waiting for results queue to drain 28983 1726883003.76517: waiting for pending results... 28983 1726883003.77206: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 28983 1726883003.77485: in run() - task 0affe814-3a2d-b16d-c0a7-0000000008a9 28983 1726883003.77575: variable 'ansible_search_path' from source: unknown 28983 1726883003.77581: variable 'ansible_search_path' from source: unknown 28983 1726883003.77626: calling self._execute() 28983 1726883003.77938: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.77950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.77962: variable 'omit' from source: magic vars 28983 1726883003.78897: variable 'ansible_distribution_major_version' from source: facts 28983 1726883003.78912: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883003.79014: variable 'omit' from source: magic vars 28983 1726883003.79152: variable 'omit' from source: magic vars 28983 1726883003.79493: variable 'interface' from source: play vars 28983 1726883003.79497: variable 'omit' from source: magic vars 28983 1726883003.79561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883003.79765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883003.79938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883003.79965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883003.79979: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883003.80017: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883003.80021: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.80024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.80676: Set connection var ansible_connection to ssh 28983 1726883003.80688: Set connection var ansible_shell_executable to /bin/sh 28983 1726883003.80700: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883003.80721: Set connection var ansible_timeout to 10 28983 1726883003.80729: Set connection var ansible_pipelining to False 28983 1726883003.80732: Set connection var ansible_shell_type to sh 28983 1726883003.80761: variable 'ansible_shell_executable' from source: unknown 28983 1726883003.80765: variable 'ansible_connection' from source: unknown 28983 1726883003.80768: variable 'ansible_module_compression' from source: unknown 28983 1726883003.80775: variable 'ansible_shell_type' from source: unknown 28983 1726883003.80778: variable 'ansible_shell_executable' from source: unknown 28983 1726883003.80781: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.80786: variable 'ansible_pipelining' from source: unknown 28983 1726883003.80790: variable 'ansible_timeout' from source: unknown 28983 1726883003.80806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.81441: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883003.81539: variable 'omit' from source: magic vars 28983 1726883003.81543: 
starting attempt loop 28983 1726883003.81546: running the handler 28983 1726883003.81943: variable 'interface_stat' from source: set_fact 28983 1726883003.81956: Evaluated conditional (not interface_stat.stat.exists): True 28983 1726883003.81964: handler run complete 28983 1726883003.81984: attempt loop complete, returning result 28983 1726883003.81987: _execute() done 28983 1726883003.81990: dumping result to json 28983 1726883003.82117: done dumping result, returning 28983 1726883003.82120: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-0000000008a9] 28983 1726883003.82124: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008a9 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883003.82287: no more pending results, returning what we have 28983 1726883003.82291: results queue empty 28983 1726883003.82292: checking for any_errors_fatal 28983 1726883003.82304: done checking for any_errors_fatal 28983 1726883003.82305: checking for max_fail_percentage 28983 1726883003.82307: done checking for max_fail_percentage 28983 1726883003.82308: checking to see if all hosts have failed and the running result is not ok 28983 1726883003.82310: done checking to see if all hosts have failed 28983 1726883003.82311: getting the remaining hosts for this loop 28983 1726883003.82313: done getting the remaining hosts for this loop 28983 1726883003.82318: getting the next task for host managed_node2 28983 1726883003.82330: done getting next task for host managed_node2 28983 1726883003.82336: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28983 1726883003.82341: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883003.82348: getting variables 28983 1726883003.82350: in VariableManager get_vars() 28983 1726883003.82389: Calling all_inventory to load vars for managed_node2 28983 1726883003.82392: Calling groups_inventory to load vars for managed_node2 28983 1726883003.82397: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883003.82407: Calling all_plugins_play to load vars for managed_node2 28983 1726883003.82411: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883003.82414: Calling groups_plugins_play to load vars for managed_node2 28983 1726883003.83040: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008a9 28983 1726883003.83044: WORKER PROCESS EXITING 28983 1726883003.87482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883003.93474: done with get_vars() 28983 1726883003.93524: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:43:23 -0400 (0:00:00.184) 0:00:33.936 ****** 28983 1726883003.93850: entering _queue_task() for managed_node2/include_tasks 
28983 1726883003.94650: worker is 1 (out of 1 available) 28983 1726883003.94663: exiting _queue_task() for managed_node2/include_tasks 28983 1726883003.94678: done queuing things up, now waiting for results queue to drain 28983 1726883003.94680: waiting for pending results... 28983 1726883003.95304: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28983 1726883003.95878: in run() - task 0affe814-3a2d-b16d-c0a7-0000000008ad 28983 1726883003.95891: variable 'ansible_search_path' from source: unknown 28983 1726883003.95895: variable 'ansible_search_path' from source: unknown 28983 1726883003.95939: calling self._execute() 28983 1726883003.96469: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883003.96477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883003.96577: variable 'omit' from source: magic vars 28983 1726883003.98217: variable 'ansible_distribution_major_version' from source: facts 28983 1726883003.98233: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883003.98242: _execute() done 28983 1726883003.98246: dumping result to json 28983 1726883003.98257: done dumping result, returning 28983 1726883003.98267: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-b16d-c0a7-0000000008ad] 28983 1726883003.98335: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008ad 28983 1726883003.98502: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008ad 28983 1726883003.98505: WORKER PROCESS EXITING 28983 1726883003.98540: no more pending results, returning what we have 28983 1726883003.98546: in VariableManager get_vars() 28983 1726883003.98596: Calling all_inventory to load vars for managed_node2 28983 1726883003.98600: Calling groups_inventory to load vars for managed_node2 28983 1726883003.98605: Calling all_plugins_inventory to load vars for 
managed_node2 28983 1726883003.98621: Calling all_plugins_play to load vars for managed_node2 28983 1726883003.98625: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883003.98629: Calling groups_plugins_play to load vars for managed_node2 28983 1726883004.04452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883004.11906: done with get_vars() 28983 1726883004.11957: variable 'ansible_search_path' from source: unknown 28983 1726883004.11959: variable 'ansible_search_path' from source: unknown 28983 1726883004.11973: variable 'item' from source: include params 28983 1726883004.12335: variable 'item' from source: include params 28983 1726883004.12388: we have included files to process 28983 1726883004.12390: generating all_blocks data 28983 1726883004.12392: done generating all_blocks data 28983 1726883004.12397: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883004.12399: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883004.12402: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883004.15281: done processing included file 28983 1726883004.15284: iterating over new_blocks loaded from include file 28983 1726883004.15286: in VariableManager get_vars() 28983 1726883004.15308: done with get_vars() 28983 1726883004.15311: filtering new block on tags 28983 1726883004.15415: done filtering new block on tags 28983 1726883004.15418: in VariableManager get_vars() 28983 1726883004.15643: done with get_vars() 28983 1726883004.15646: filtering new block on tags 28983 1726883004.15730: done filtering new block on tags 28983 1726883004.15735: done iterating over new_blocks loaded from 
include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28983 1726883004.15741: extending task lists for all hosts with included blocks 28983 1726883004.16855: done extending task lists 28983 1726883004.16860: done processing included files 28983 1726883004.16862: results queue empty 28983 1726883004.16863: checking for any_errors_fatal 28983 1726883004.16867: done checking for any_errors_fatal 28983 1726883004.16868: checking for max_fail_percentage 28983 1726883004.16869: done checking for max_fail_percentage 28983 1726883004.16870: checking to see if all hosts have failed and the running result is not ok 28983 1726883004.16871: done checking to see if all hosts have failed 28983 1726883004.16872: getting the remaining hosts for this loop 28983 1726883004.16874: done getting the remaining hosts for this loop 28983 1726883004.16877: getting the next task for host managed_node2 28983 1726883004.16884: done getting next task for host managed_node2 28983 1726883004.16888: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883004.16892: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883004.16896: getting variables 28983 1726883004.16897: in VariableManager get_vars() 28983 1726883004.16908: Calling all_inventory to load vars for managed_node2 28983 1726883004.16911: Calling groups_inventory to load vars for managed_node2 28983 1726883004.16914: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883004.16923: Calling all_plugins_play to load vars for managed_node2 28983 1726883004.16928: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883004.16932: Calling groups_plugins_play to load vars for managed_node2 28983 1726883004.21186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883004.26897: done with get_vars() 28983 1726883004.26939: done getting variables 28983 1726883004.26994: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:43:24 -0400 (0:00:00.331) 0:00:34.268 ****** 28983 1726883004.27033: entering _queue_task() for managed_node2/set_fact 28983 1726883004.28273: worker is 1 (out of 1 available) 28983 1726883004.28287: exiting _queue_task() for managed_node2/set_fact 28983 1726883004.28303: done 
queuing things up, now waiting for results queue to drain 28983 1726883004.28305: waiting for pending results... 28983 1726883004.28865: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883004.28933: in run() - task 0affe814-3a2d-b16d-c0a7-000000000946 28983 1726883004.28965: variable 'ansible_search_path' from source: unknown 28983 1726883004.28974: variable 'ansible_search_path' from source: unknown 28983 1726883004.29087: calling self._execute() 28983 1726883004.29396: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.29399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.29403: variable 'omit' from source: magic vars 28983 1726883004.30564: variable 'ansible_distribution_major_version' from source: facts 28983 1726883004.30585: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883004.30651: variable 'omit' from source: magic vars 28983 1726883004.30906: variable 'omit' from source: magic vars 28983 1726883004.30960: variable 'omit' from source: magic vars 28983 1726883004.31218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883004.31222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883004.31277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883004.31367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883004.31579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883004.31583: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883004.31586: variable 'ansible_host' from source: 
host vars for 'managed_node2' 28983 1726883004.31588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.31731: Set connection var ansible_connection to ssh 28983 1726883004.31752: Set connection var ansible_shell_executable to /bin/sh 28983 1726883004.31772: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883004.31788: Set connection var ansible_timeout to 10 28983 1726883004.31801: Set connection var ansible_pipelining to False 28983 1726883004.31808: Set connection var ansible_shell_type to sh 28983 1726883004.31839: variable 'ansible_shell_executable' from source: unknown 28983 1726883004.31849: variable 'ansible_connection' from source: unknown 28983 1726883004.31858: variable 'ansible_module_compression' from source: unknown 28983 1726883004.31870: variable 'ansible_shell_type' from source: unknown 28983 1726883004.31877: variable 'ansible_shell_executable' from source: unknown 28983 1726883004.31885: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.31894: variable 'ansible_pipelining' from source: unknown 28983 1726883004.31902: variable 'ansible_timeout' from source: unknown 28983 1726883004.31910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.32076: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883004.32102: variable 'omit' from source: magic vars 28983 1726883004.32194: starting attempt loop 28983 1726883004.32197: running the handler 28983 1726883004.32200: handler run complete 28983 1726883004.32202: attempt loop complete, returning result 28983 1726883004.32204: _execute() done 28983 1726883004.32206: dumping result to json 28983 
1726883004.32208: done dumping result, returning 28983 1726883004.32210: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-b16d-c0a7-000000000946] 28983 1726883004.32213: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000946 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28983 1726883004.32363: no more pending results, returning what we have 28983 1726883004.32368: results queue empty 28983 1726883004.32369: checking for any_errors_fatal 28983 1726883004.32371: done checking for any_errors_fatal 28983 1726883004.32372: checking for max_fail_percentage 28983 1726883004.32374: done checking for max_fail_percentage 28983 1726883004.32375: checking to see if all hosts have failed and the running result is not ok 28983 1726883004.32375: done checking to see if all hosts have failed 28983 1726883004.32376: getting the remaining hosts for this loop 28983 1726883004.32379: done getting the remaining hosts for this loop 28983 1726883004.32385: getting the next task for host managed_node2 28983 1726883004.32395: done getting next task for host managed_node2 28983 1726883004.32398: ^ task is: TASK: Stat profile file 28983 1726883004.32405: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883004.32411: getting variables 28983 1726883004.32412: in VariableManager get_vars() 28983 1726883004.32454: Calling all_inventory to load vars for managed_node2 28983 1726883004.32458: Calling groups_inventory to load vars for managed_node2 28983 1726883004.32462: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883004.32475: Calling all_plugins_play to load vars for managed_node2 28983 1726883004.32479: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883004.32484: Calling groups_plugins_play to load vars for managed_node2 28983 1726883004.33447: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000946 28983 1726883004.33451: WORKER PROCESS EXITING 28983 1726883004.35617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883004.41077: done with get_vars() 28983 1726883004.41130: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:43:24 -0400 (0:00:00.142) 0:00:34.410 ****** 28983 1726883004.41322: entering _queue_task() for managed_node2/stat 28983 1726883004.41799: worker is 1 (out of 1 available) 28983 1726883004.41813: exiting 
_queue_task() for managed_node2/stat 28983 1726883004.41828: done queuing things up, now waiting for results queue to drain 28983 1726883004.41833: waiting for pending results... 28983 1726883004.42152: running TaskExecutor() for managed_node2/TASK: Stat profile file 28983 1726883004.42323: in run() - task 0affe814-3a2d-b16d-c0a7-000000000947 28983 1726883004.42337: variable 'ansible_search_path' from source: unknown 28983 1726883004.42341: variable 'ansible_search_path' from source: unknown 28983 1726883004.42397: calling self._execute() 28983 1726883004.42522: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.42532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.42641: variable 'omit' from source: magic vars 28983 1726883004.43015: variable 'ansible_distribution_major_version' from source: facts 28983 1726883004.43039: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883004.43048: variable 'omit' from source: magic vars 28983 1726883004.43120: variable 'omit' from source: magic vars 28983 1726883004.43263: variable 'profile' from source: play vars 28983 1726883004.43269: variable 'interface' from source: play vars 28983 1726883004.43370: variable 'interface' from source: play vars 28983 1726883004.43399: variable 'omit' from source: magic vars 28983 1726883004.43447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883004.43503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883004.43537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883004.43557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883004.43587: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883004.43637: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883004.43656: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.43661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.43819: Set connection var ansible_connection to ssh 28983 1726883004.43832: Set connection var ansible_shell_executable to /bin/sh 28983 1726883004.43861: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883004.44140: Set connection var ansible_timeout to 10 28983 1726883004.44143: Set connection var ansible_pipelining to False 28983 1726883004.44147: Set connection var ansible_shell_type to sh 28983 1726883004.44149: variable 'ansible_shell_executable' from source: unknown 28983 1726883004.44152: variable 'ansible_connection' from source: unknown 28983 1726883004.44154: variable 'ansible_module_compression' from source: unknown 28983 1726883004.44157: variable 'ansible_shell_type' from source: unknown 28983 1726883004.44159: variable 'ansible_shell_executable' from source: unknown 28983 1726883004.44161: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.44163: variable 'ansible_pipelining' from source: unknown 28983 1726883004.44166: variable 'ansible_timeout' from source: unknown 28983 1726883004.44168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.44382: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883004.44387: variable 'omit' from source: magic vars 28983 1726883004.44389: starting attempt loop 28983 1726883004.44392: running 
the handler 28983 1726883004.44394: _low_level_execute_command(): starting 28983 1726883004.44396: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883004.45522: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883004.45648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.45660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883004.45696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.45801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.47585: stdout chunk (state=3): >>>/root <<< 28983 1726883004.47794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883004.47798: stdout chunk (state=3): >>><<< 28983 1726883004.47801: stderr chunk (state=3): >>><<< 28983 1726883004.47936: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883004.47941: _low_level_execute_command(): starting 28983 1726883004.47944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941 `" && echo ansible-tmp-1726883004.4782867-30286-279033072797941="` echo /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941 `" ) && sleep 0' 28983 1726883004.48530: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883004.48549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883004.48602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883004.48697: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.48741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883004.48766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883004.48801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.48888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.50918: stdout chunk (state=3): >>>ansible-tmp-1726883004.4782867-30286-279033072797941=/root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941 <<< 28983 1726883004.51117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883004.51120: stdout chunk (state=3): >>><<< 28983 1726883004.51123: stderr chunk (state=3): >>><<< 28983 1726883004.51345: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883004.4782867-30286-279033072797941=/root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883004.51349: variable 'ansible_module_compression' from source: unknown 28983 1726883004.51351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883004.51354: variable 'ansible_facts' from source: unknown 28983 1726883004.51400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py 28983 1726883004.51588: Sending initial data 28983 1726883004.51598: Sent initial data (153 bytes) 28983 1726883004.52250: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883004.52346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.52395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883004.52418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.52522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.54175: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883004.54266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883004.54344: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp5rs20n7e /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py <<< 28983 1726883004.54355: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py" <<< 28983 1726883004.54406: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp5rs20n7e" to remote "/root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py" <<< 28983 1726883004.55733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883004.55888: stderr chunk (state=3): >>><<< 28983 1726883004.55919: stdout chunk (state=3): >>><<< 28983 1726883004.55954: done transferring module to remote 28983 1726883004.55975: _low_level_execute_command(): starting 28983 1726883004.56023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/ /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py && sleep 0' 28983 1726883004.56683: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883004.56753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.56828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883004.56847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883004.56868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.57043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.59140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883004.59143: stdout chunk (state=3): >>><<< 28983 1726883004.59146: stderr chunk (state=3): >>><<< 28983 1726883004.59149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883004.59151: _low_level_execute_command(): starting 28983 1726883004.59154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/AnsiballZ_stat.py && sleep 0' 28983 1726883004.59906: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883004.59913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883004.59928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883004.59937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.59944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883004.59951: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883004.59964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.59984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.60052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 28983 1726883004.60075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.60186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.77292: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883004.78772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883004.78832: stderr chunk (state=3): >>><<< 28983 1726883004.78838: stdout chunk (state=3): >>><<< 28983 1726883004.78855: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883004.78885: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883004.78895: _low_level_execute_command(): starting 28983 1726883004.78900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883004.4782867-30286-279033072797941/ > /dev/null 2>&1 && sleep 0' 28983 1726883004.79382: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.79386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883004.79389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883004.79392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.79458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883004.79461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.79536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.81544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883004.81548: stdout chunk (state=3): >>><<< 28983 1726883004.81550: stderr chunk (state=3): >>><<< 28983 1726883004.81569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883004.81585: handler run complete 28983 1726883004.81739: attempt loop complete, returning result 28983 1726883004.81742: _execute() done 28983 1726883004.81745: dumping result to json 28983 1726883004.81747: done dumping result, returning 28983 1726883004.81749: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affe814-3a2d-b16d-c0a7-000000000947] 28983 1726883004.81752: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000947 28983 1726883004.81845: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000947 28983 1726883004.81850: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883004.81920: no more pending results, returning what we have 28983 1726883004.81924: results queue empty 28983 1726883004.81925: checking for any_errors_fatal 28983 1726883004.81936: done checking for any_errors_fatal 28983 1726883004.81937: checking for max_fail_percentage 28983 1726883004.81939: done checking for max_fail_percentage 28983 1726883004.81940: checking to see if all hosts have failed and the running result is not ok 28983 1726883004.81941: done checking to see if all hosts have failed 28983 1726883004.81942: getting the remaining hosts for this loop 28983 1726883004.81945: done getting the remaining hosts for this loop 28983 1726883004.81952: getting the next task for host managed_node2 28983 1726883004.81961: done getting next task for host managed_node2 28983 1726883004.81963: ^ task is: TASK: Set NM profile exist flag based on the profile files 28983 1726883004.81969: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883004.81976: getting variables 28983 1726883004.81978: in VariableManager get_vars() 28983 1726883004.82016: Calling all_inventory to load vars for managed_node2 28983 1726883004.82019: Calling groups_inventory to load vars for managed_node2 28983 1726883004.82023: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883004.82203: Calling all_plugins_play to load vars for managed_node2 28983 1726883004.82209: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883004.82215: Calling groups_plugins_play to load vars for managed_node2 28983 1726883004.83790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883004.85563: done with get_vars() 28983 1726883004.85599: done getting variables 28983 1726883004.85683: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:43:24 -0400 (0:00:00.443) 0:00:34.854 ****** 28983 1726883004.85721: entering _queue_task() for managed_node2/set_fact 28983 1726883004.86141: worker is 1 (out of 1 available) 28983 1726883004.86153: exiting _queue_task() for managed_node2/set_fact 28983 1726883004.86166: done queuing things up, now waiting for results queue to drain 28983 1726883004.86168: waiting for pending results... 
28983 1726883004.86389: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28983 1726883004.86523: in run() - task 0affe814-3a2d-b16d-c0a7-000000000948 28983 1726883004.86537: variable 'ansible_search_path' from source: unknown 28983 1726883004.86542: variable 'ansible_search_path' from source: unknown 28983 1726883004.86574: calling self._execute() 28983 1726883004.86662: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.86668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.86683: variable 'omit' from source: magic vars 28983 1726883004.87009: variable 'ansible_distribution_major_version' from source: facts 28983 1726883004.87021: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883004.87131: variable 'profile_stat' from source: set_fact 28983 1726883004.87143: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883004.87146: when evaluation is False, skipping this task 28983 1726883004.87150: _execute() done 28983 1726883004.87155: dumping result to json 28983 1726883004.87157: done dumping result, returning 28983 1726883004.87165: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-b16d-c0a7-000000000948] 28983 1726883004.87177: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000948 28983 1726883004.87272: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000948 28983 1726883004.87276: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883004.87338: no more pending results, returning what we have 28983 1726883004.87342: results queue empty 28983 1726883004.87343: checking for any_errors_fatal 28983 1726883004.87351: done checking for any_errors_fatal 28983 1726883004.87352: 
checking for max_fail_percentage 28983 1726883004.87354: done checking for max_fail_percentage 28983 1726883004.87355: checking to see if all hosts have failed and the running result is not ok 28983 1726883004.87356: done checking to see if all hosts have failed 28983 1726883004.87357: getting the remaining hosts for this loop 28983 1726883004.87358: done getting the remaining hosts for this loop 28983 1726883004.87362: getting the next task for host managed_node2 28983 1726883004.87369: done getting next task for host managed_node2 28983 1726883004.87372: ^ task is: TASK: Get NM profile info 28983 1726883004.87378: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883004.87382: getting variables 28983 1726883004.87383: in VariableManager get_vars() 28983 1726883004.87414: Calling all_inventory to load vars for managed_node2 28983 1726883004.87417: Calling groups_inventory to load vars for managed_node2 28983 1726883004.87420: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883004.87429: Calling all_plugins_play to load vars for managed_node2 28983 1726883004.87433: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883004.87438: Calling groups_plugins_play to load vars for managed_node2 28983 1726883004.88642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883004.90260: done with get_vars() 28983 1726883004.90287: done getting variables 28983 1726883004.90332: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:43:24 -0400 (0:00:00.046) 0:00:34.901 ****** 28983 1726883004.90361: entering _queue_task() for managed_node2/shell 28983 1726883004.90574: worker is 1 (out of 1 available) 28983 1726883004.90587: exiting _queue_task() for managed_node2/shell 28983 1726883004.90600: done queuing things up, now waiting for results queue to drain 28983 1726883004.90602: waiting for pending results... 
28983 1726883004.90792: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28983 1726883004.90893: in run() - task 0affe814-3a2d-b16d-c0a7-000000000949 28983 1726883004.90906: variable 'ansible_search_path' from source: unknown 28983 1726883004.90910: variable 'ansible_search_path' from source: unknown 28983 1726883004.90945: calling self._execute() 28983 1726883004.91025: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.91030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.91043: variable 'omit' from source: magic vars 28983 1726883004.91355: variable 'ansible_distribution_major_version' from source: facts 28983 1726883004.91366: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883004.91376: variable 'omit' from source: magic vars 28983 1726883004.91422: variable 'omit' from source: magic vars 28983 1726883004.91509: variable 'profile' from source: play vars 28983 1726883004.91513: variable 'interface' from source: play vars 28983 1726883004.91570: variable 'interface' from source: play vars 28983 1726883004.91587: variable 'omit' from source: magic vars 28983 1726883004.91627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883004.91660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883004.91679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883004.91696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883004.91708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883004.91738: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 
1726883004.91742: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.91747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.91832: Set connection var ansible_connection to ssh 28983 1726883004.91841: Set connection var ansible_shell_executable to /bin/sh 28983 1726883004.91854: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883004.91861: Set connection var ansible_timeout to 10 28983 1726883004.91867: Set connection var ansible_pipelining to False 28983 1726883004.91870: Set connection var ansible_shell_type to sh 28983 1726883004.91891: variable 'ansible_shell_executable' from source: unknown 28983 1726883004.91894: variable 'ansible_connection' from source: unknown 28983 1726883004.91897: variable 'ansible_module_compression' from source: unknown 28983 1726883004.91901: variable 'ansible_shell_type' from source: unknown 28983 1726883004.91904: variable 'ansible_shell_executable' from source: unknown 28983 1726883004.91908: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883004.91913: variable 'ansible_pipelining' from source: unknown 28983 1726883004.91916: variable 'ansible_timeout' from source: unknown 28983 1726883004.91924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883004.92040: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883004.92053: variable 'omit' from source: magic vars 28983 1726883004.92058: starting attempt loop 28983 1726883004.92061: running the handler 28983 1726883004.92076: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883004.92092: _low_level_execute_command(): starting 28983 1726883004.92099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883004.92613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883004.92648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883004.92651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.92654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.92706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883004.92712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.92791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.94579: stdout chunk (state=3): >>>/root <<< 28983 1726883004.94696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 
1726883004.94737: stderr chunk (state=3): >>><<< 28983 1726883004.94740: stdout chunk (state=3): >>><<< 28983 1726883004.94763: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883004.94778: _low_level_execute_command(): starting 28983 1726883004.94784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786 `" && echo ansible-tmp-1726883004.947632-30312-179987304007786="` echo /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786 `" ) && sleep 0' 28983 1726883004.95194: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.95233: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883004.95237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.95240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.95244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.95300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883004.95307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.95381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883004.97400: stdout chunk (state=3): >>>ansible-tmp-1726883004.947632-30312-179987304007786=/root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786 <<< 28983 1726883004.97518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883004.97562: stderr chunk (state=3): >>><<< 28983 1726883004.97566: stdout chunk (state=3): >>><<< 28983 1726883004.97583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883004.947632-30312-179987304007786=/root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883004.97608: variable 'ansible_module_compression' from source: unknown 28983 1726883004.97652: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883004.97689: variable 'ansible_facts' from source: unknown 28983 1726883004.97742: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py 28983 1726883004.97847: Sending initial data 28983 1726883004.97851: Sent initial data (155 bytes) 28983 1726883004.98303: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883004.98306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found <<< 28983 1726883004.98309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883004.98311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883004.98365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883004.98369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883004.98436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883005.00083: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883005.00087: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883005.00152: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883005.00219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpd17riv56 /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py <<< 28983 1726883005.00228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py" <<< 28983 1726883005.00290: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpd17riv56" to remote "/root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py" <<< 28983 1726883005.01206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883005.01264: stderr chunk (state=3): >>><<< 28983 1726883005.01268: stdout chunk (state=3): >>><<< 28983 1726883005.01292: done transferring module to remote 28983 1726883005.01300: _low_level_execute_command(): starting 28983 1726883005.01306: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/ /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py && sleep 0' 28983 1726883005.01704: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883005.01739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883005.01742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883005.01745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883005.01750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883005.01803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883005.01807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883005.01877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883005.03736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883005.03778: stderr chunk (state=3): >>><<< 28983 1726883005.03781: stdout chunk (state=3): >>><<< 28983 1726883005.03794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883005.03797: _low_level_execute_command(): starting 28983 1726883005.03804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/AnsiballZ_command.py && sleep 0' 28983 1726883005.04230: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883005.04235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883005.04238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883005.04241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883005.04295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883005.04302: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883005.04375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883005.23261: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:43:25.212887", "end": "2024-09-20 21:43:25.231395", "delta": "0:00:00.018508", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883005.24882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883005.24935: stderr chunk (state=3): >>><<< 28983 1726883005.24941: stdout chunk (state=3): >>><<< 28983 1726883005.24961: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:43:25.212887", "end": "2024-09-20 21:43:25.231395", "delta": "0:00:00.018508", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 
closed. 28983 1726883005.24993: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883005.25002: _low_level_execute_command(): starting 28983 1726883005.25008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883004.947632-30312-179987304007786/ > /dev/null 2>&1 && sleep 0' 28983 1726883005.25475: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883005.25479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883005.25487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883005.25490: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883005.25492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883005.25542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883005.25546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883005.25625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883005.27562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883005.27602: stderr chunk (state=3): >>><<< 28983 1726883005.27606: stdout chunk (state=3): >>><<< 28983 1726883005.27619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883005.27626: handler run complete 28983 1726883005.27653: Evaluated conditional (False): False 28983 1726883005.27666: attempt loop complete, returning result 28983 1726883005.27669: _execute() done 28983 1726883005.27674: dumping result to json 28983 1726883005.27682: done dumping result, returning 28983 1726883005.27691: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affe814-3a2d-b16d-c0a7-000000000949] 28983 1726883005.27697: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000949 28983 1726883005.27805: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000949 28983 1726883005.27808: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018508", "end": "2024-09-20 21:43:25.231395", "rc": 0, "start": "2024-09-20 21:43:25.212887" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 28983 1726883005.27893: no more pending results, returning what we have 28983 1726883005.27897: results queue empty 28983 1726883005.27898: checking for any_errors_fatal 28983 1726883005.27907: done checking for any_errors_fatal 28983 1726883005.27908: checking for max_fail_percentage 28983 1726883005.27910: done checking for max_fail_percentage 28983 1726883005.27911: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.27912: done checking to see if all hosts have failed 28983 1726883005.27913: getting the remaining hosts for this loop 28983 1726883005.27915: done getting the remaining hosts for this loop 28983 1726883005.27919: getting the next task for host managed_node2 28983 1726883005.27928: done getting next task for host managed_node2 28983 1726883005.27931: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 
1726883005.27938: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883005.27943: getting variables 28983 1726883005.27944: in VariableManager get_vars() 28983 1726883005.27979: Calling all_inventory to load vars for managed_node2 28983 1726883005.27982: Calling groups_inventory to load vars for managed_node2 28983 1726883005.27986: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.27996: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.27999: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.28002: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.29401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.31002: done with get_vars() 28983 1726883005.31025: done getting variables 28983 1726883005.31076: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:43:25 -0400 (0:00:00.407) 0:00:35.308 ****** 28983 1726883005.31107: entering _queue_task() for managed_node2/set_fact 28983 1726883005.31367: worker is 1 (out of 1 available) 28983 1726883005.31382: exiting _queue_task() for managed_node2/set_fact 28983 1726883005.31396: done queuing things up, now waiting for results queue to drain 28983 1726883005.31398: waiting for pending results... 
28983 1726883005.31594: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883005.31703: in run() - task 0affe814-3a2d-b16d-c0a7-00000000094a 28983 1726883005.31717: variable 'ansible_search_path' from source: unknown 28983 1726883005.31722: variable 'ansible_search_path' from source: unknown 28983 1726883005.31763: calling self._execute() 28983 1726883005.31856: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.31860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.31869: variable 'omit' from source: magic vars 28983 1726883005.32192: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.32204: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.32319: variable 'nm_profile_exists' from source: set_fact 28983 1726883005.32331: Evaluated conditional (nm_profile_exists.rc == 0): True 28983 1726883005.32339: variable 'omit' from source: magic vars 28983 1726883005.32385: variable 'omit' from source: magic vars 28983 1726883005.32416: variable 'omit' from source: magic vars 28983 1726883005.32453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883005.32487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883005.32509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883005.32526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.32538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.32566: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
28983 1726883005.32571: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.32577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.32662: Set connection var ansible_connection to ssh 28983 1726883005.32672: Set connection var ansible_shell_executable to /bin/sh 28983 1726883005.32683: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883005.32692: Set connection var ansible_timeout to 10 28983 1726883005.32698: Set connection var ansible_pipelining to False 28983 1726883005.32701: Set connection var ansible_shell_type to sh 28983 1726883005.32723: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.32728: variable 'ansible_connection' from source: unknown 28983 1726883005.32732: variable 'ansible_module_compression' from source: unknown 28983 1726883005.32734: variable 'ansible_shell_type' from source: unknown 28983 1726883005.32744: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.32747: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.32749: variable 'ansible_pipelining' from source: unknown 28983 1726883005.32751: variable 'ansible_timeout' from source: unknown 28983 1726883005.32755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.32881: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883005.32891: variable 'omit' from source: magic vars 28983 1726883005.32897: starting attempt loop 28983 1726883005.32900: running the handler 28983 1726883005.32914: handler run complete 28983 1726883005.32922: attempt loop complete, returning result 28983 1726883005.32925: _execute() done 
28983 1726883005.32929: dumping result to json 28983 1726883005.32934: done dumping result, returning 28983 1726883005.32946: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-b16d-c0a7-00000000094a] 28983 1726883005.32949: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094a 28983 1726883005.33039: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094a 28983 1726883005.33042: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 28983 1726883005.33108: no more pending results, returning what we have 28983 1726883005.33111: results queue empty 28983 1726883005.33112: checking for any_errors_fatal 28983 1726883005.33122: done checking for any_errors_fatal 28983 1726883005.33123: checking for max_fail_percentage 28983 1726883005.33125: done checking for max_fail_percentage 28983 1726883005.33125: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.33126: done checking to see if all hosts have failed 28983 1726883005.33127: getting the remaining hosts for this loop 28983 1726883005.33130: done getting the remaining hosts for this loop 28983 1726883005.33137: getting the next task for host managed_node2 28983 1726883005.33148: done getting next task for host managed_node2 28983 1726883005.33151: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883005.33156: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.33160: getting variables 28983 1726883005.33162: in VariableManager get_vars() 28983 1726883005.33191: Calling all_inventory to load vars for managed_node2 28983 1726883005.33194: Calling groups_inventory to load vars for managed_node2 28983 1726883005.33197: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.33206: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.33209: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.33213: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.34557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.36155: done with get_vars() 28983 1726883005.36179: done getting variables 28983 1726883005.36225: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.36326: variable 'profile' from source: play vars 28983 
1726883005.36329: variable 'interface' from source: play vars 28983 1726883005.36378: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:43:25 -0400 (0:00:00.052) 0:00:35.361 ****** 28983 1726883005.36408: entering _queue_task() for managed_node2/command 28983 1726883005.36631: worker is 1 (out of 1 available) 28983 1726883005.36646: exiting _queue_task() for managed_node2/command 28983 1726883005.36663: done queuing things up, now waiting for results queue to drain 28983 1726883005.36665: waiting for pending results... 28983 1726883005.36869: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 28983 1726883005.36955: in run() - task 0affe814-3a2d-b16d-c0a7-00000000094c 28983 1726883005.36970: variable 'ansible_search_path' from source: unknown 28983 1726883005.36977: variable 'ansible_search_path' from source: unknown 28983 1726883005.37009: calling self._execute() 28983 1726883005.37093: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.37104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.37113: variable 'omit' from source: magic vars 28983 1726883005.37425: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.37440: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.37545: variable 'profile_stat' from source: set_fact 28983 1726883005.37555: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883005.37558: when evaluation is False, skipping this task 28983 1726883005.37561: _execute() done 28983 1726883005.37566: dumping result to json 28983 1726883005.37571: done dumping result, returning 28983 1726883005.37577: done running 
TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000094c] 28983 1726883005.37584: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094c 28983 1726883005.37682: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094c 28983 1726883005.37685: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883005.37749: no more pending results, returning what we have 28983 1726883005.37753: results queue empty 28983 1726883005.37754: checking for any_errors_fatal 28983 1726883005.37760: done checking for any_errors_fatal 28983 1726883005.37761: checking for max_fail_percentage 28983 1726883005.37763: done checking for max_fail_percentage 28983 1726883005.37763: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.37764: done checking to see if all hosts have failed 28983 1726883005.37765: getting the remaining hosts for this loop 28983 1726883005.37767: done getting the remaining hosts for this loop 28983 1726883005.37771: getting the next task for host managed_node2 28983 1726883005.37780: done getting next task for host managed_node2 28983 1726883005.37783: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883005.37788: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.37792: getting variables 28983 1726883005.37793: in VariableManager get_vars() 28983 1726883005.37830: Calling all_inventory to load vars for managed_node2 28983 1726883005.37836: Calling groups_inventory to load vars for managed_node2 28983 1726883005.37840: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.37848: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.37850: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.37852: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.39058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.40666: done with get_vars() 28983 1726883005.40689: done getting variables 28983 1726883005.40738: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.40820: variable 'profile' from source: play vars 28983 1726883005.40823: variable 'interface' from source: play vars 28983 1726883005.40876: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] 
********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:43:25 -0400 (0:00:00.044) 0:00:35.406 ****** 28983 1726883005.40903: entering _queue_task() for managed_node2/set_fact 28983 1726883005.41120: worker is 1 (out of 1 available) 28983 1726883005.41136: exiting _queue_task() for managed_node2/set_fact 28983 1726883005.41151: done queuing things up, now waiting for results queue to drain 28983 1726883005.41153: waiting for pending results... 28983 1726883005.41351: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 28983 1726883005.41448: in run() - task 0affe814-3a2d-b16d-c0a7-00000000094d 28983 1726883005.41460: variable 'ansible_search_path' from source: unknown 28983 1726883005.41463: variable 'ansible_search_path' from source: unknown 28983 1726883005.41498: calling self._execute() 28983 1726883005.41581: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.41585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.41598: variable 'omit' from source: magic vars 28983 1726883005.41910: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.41921: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.42031: variable 'profile_stat' from source: set_fact 28983 1726883005.42043: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883005.42049: when evaluation is False, skipping this task 28983 1726883005.42052: _execute() done 28983 1726883005.42055: dumping result to json 28983 1726883005.42057: done dumping result, returning 28983 1726883005.42067: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000094d] 28983 1726883005.42070: sending task result for task 
0affe814-3a2d-b16d-c0a7-00000000094d 28983 1726883005.42169: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094d 28983 1726883005.42173: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883005.42223: no more pending results, returning what we have 28983 1726883005.42227: results queue empty 28983 1726883005.42228: checking for any_errors_fatal 28983 1726883005.42233: done checking for any_errors_fatal 28983 1726883005.42238: checking for max_fail_percentage 28983 1726883005.42240: done checking for max_fail_percentage 28983 1726883005.42241: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.42242: done checking to see if all hosts have failed 28983 1726883005.42242: getting the remaining hosts for this loop 28983 1726883005.42244: done getting the remaining hosts for this loop 28983 1726883005.42248: getting the next task for host managed_node2 28983 1726883005.42255: done getting next task for host managed_node2 28983 1726883005.42258: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28983 1726883005.42264: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.42268: getting variables 28983 1726883005.42269: in VariableManager get_vars() 28983 1726883005.42297: Calling all_inventory to load vars for managed_node2 28983 1726883005.42299: Calling groups_inventory to load vars for managed_node2 28983 1726883005.42303: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.42312: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.42316: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.42319: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.43628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.45227: done with get_vars() 28983 1726883005.45250: done getting variables 28983 1726883005.45301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.45385: variable 'profile' from source: play vars 28983 1726883005.45389: variable 'interface' from source: play vars 28983 1726883005.45433: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:43:25 -0400 (0:00:00.045) 
0:00:35.452 ****** 28983 1726883005.45459: entering _queue_task() for managed_node2/command 28983 1726883005.45675: worker is 1 (out of 1 available) 28983 1726883005.45689: exiting _queue_task() for managed_node2/command 28983 1726883005.45703: done queuing things up, now waiting for results queue to drain 28983 1726883005.45705: waiting for pending results... 28983 1726883005.45891: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 28983 1726883005.45988: in run() - task 0affe814-3a2d-b16d-c0a7-00000000094e 28983 1726883005.46001: variable 'ansible_search_path' from source: unknown 28983 1726883005.46004: variable 'ansible_search_path' from source: unknown 28983 1726883005.46033: calling self._execute() 28983 1726883005.46117: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.46124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.46136: variable 'omit' from source: magic vars 28983 1726883005.46445: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.46455: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.46561: variable 'profile_stat' from source: set_fact 28983 1726883005.46575: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883005.46578: when evaluation is False, skipping this task 28983 1726883005.46581: _execute() done 28983 1726883005.46586: dumping result to json 28983 1726883005.46589: done dumping result, returning 28983 1726883005.46592: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000094e] 28983 1726883005.46606: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094e 28983 1726883005.46694: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094e 28983 1726883005.46697: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883005.46764: no more pending results, returning what we have 28983 1726883005.46769: results queue empty 28983 1726883005.46770: checking for any_errors_fatal 28983 1726883005.46778: done checking for any_errors_fatal 28983 1726883005.46779: checking for max_fail_percentage 28983 1726883005.46781: done checking for max_fail_percentage 28983 1726883005.46782: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.46783: done checking to see if all hosts have failed 28983 1726883005.46784: getting the remaining hosts for this loop 28983 1726883005.46786: done getting the remaining hosts for this loop 28983 1726883005.46789: getting the next task for host managed_node2 28983 1726883005.46797: done getting next task for host managed_node2 28983 1726883005.46799: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28983 1726883005.46804: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.46807: getting variables 28983 1726883005.46808: in VariableManager get_vars() 28983 1726883005.46842: Calling all_inventory to load vars for managed_node2 28983 1726883005.46845: Calling groups_inventory to load vars for managed_node2 28983 1726883005.46848: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.46855: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.46858: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.46860: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.51513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.53109: done with get_vars() 28983 1726883005.53131: done getting variables 28983 1726883005.53178: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.53251: variable 'profile' from source: play vars 28983 1726883005.53254: variable 'interface' from source: play vars 28983 1726883005.53305: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:43:25 -0400 (0:00:00.078) 0:00:35.531 ****** 28983 1726883005.53327: entering _queue_task() for managed_node2/set_fact 28983 1726883005.53595: worker is 1 (out of 1 available) 28983 1726883005.53608: exiting _queue_task() for managed_node2/set_fact 28983 
1726883005.53621: done queuing things up, now waiting for results queue to drain 28983 1726883005.53623: waiting for pending results... 28983 1726883005.53814: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 28983 1726883005.53930: in run() - task 0affe814-3a2d-b16d-c0a7-00000000094f 28983 1726883005.53942: variable 'ansible_search_path' from source: unknown 28983 1726883005.53946: variable 'ansible_search_path' from source: unknown 28983 1726883005.53983: calling self._execute() 28983 1726883005.54061: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.54071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.54088: variable 'omit' from source: magic vars 28983 1726883005.54388: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.54398: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.54504: variable 'profile_stat' from source: set_fact 28983 1726883005.54521: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883005.54526: when evaluation is False, skipping this task 28983 1726883005.54530: _execute() done 28983 1726883005.54535: dumping result to json 28983 1726883005.54539: done dumping result, returning 28983 1726883005.54542: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000094f] 28983 1726883005.54545: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094f 28983 1726883005.54644: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000094f 28983 1726883005.54646: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883005.54707: no more pending results, returning what we have 28983 1726883005.54711: results queue empty 28983 
1726883005.54712: checking for any_errors_fatal 28983 1726883005.54720: done checking for any_errors_fatal 28983 1726883005.54721: checking for max_fail_percentage 28983 1726883005.54722: done checking for max_fail_percentage 28983 1726883005.54723: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.54724: done checking to see if all hosts have failed 28983 1726883005.54725: getting the remaining hosts for this loop 28983 1726883005.54727: done getting the remaining hosts for this loop 28983 1726883005.54732: getting the next task for host managed_node2 28983 1726883005.54744: done getting next task for host managed_node2 28983 1726883005.54747: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 28983 1726883005.54751: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883005.54757: getting variables 28983 1726883005.54759: in VariableManager get_vars() 28983 1726883005.54789: Calling all_inventory to load vars for managed_node2 28983 1726883005.54792: Calling groups_inventory to load vars for managed_node2 28983 1726883005.54795: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.54805: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.54808: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.54811: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.56019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.57612: done with get_vars() 28983 1726883005.57633: done getting variables 28983 1726883005.57686: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.57773: variable 'profile' from source: play vars 28983 1726883005.57777: variable 'interface' from source: play vars 28983 1726883005.57823: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:43:25 -0400 (0:00:00.045) 0:00:35.576 ****** 28983 1726883005.57850: entering _queue_task() for managed_node2/assert 28983 1726883005.58070: worker is 1 (out of 1 available) 28983 1726883005.58083: exiting _queue_task() for managed_node2/assert 28983 1726883005.58097: done queuing things up, now waiting for results queue to drain 28983 1726883005.58099: waiting for pending results... 
28983 1726883005.58296: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 28983 1726883005.58393: in run() - task 0affe814-3a2d-b16d-c0a7-0000000008ae 28983 1726883005.58406: variable 'ansible_search_path' from source: unknown 28983 1726883005.58410: variable 'ansible_search_path' from source: unknown 28983 1726883005.58449: calling self._execute() 28983 1726883005.58528: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.58537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.58554: variable 'omit' from source: magic vars 28983 1726883005.58859: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.58870: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.58884: variable 'omit' from source: magic vars 28983 1726883005.58919: variable 'omit' from source: magic vars 28983 1726883005.59006: variable 'profile' from source: play vars 28983 1726883005.59012: variable 'interface' from source: play vars 28983 1726883005.59065: variable 'interface' from source: play vars 28983 1726883005.59085: variable 'omit' from source: magic vars 28983 1726883005.59124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883005.59156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883005.59178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883005.59196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.59207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.59236: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883005.59240: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.59245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.59330: Set connection var ansible_connection to ssh 28983 1726883005.59344: Set connection var ansible_shell_executable to /bin/sh 28983 1726883005.59353: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883005.59361: Set connection var ansible_timeout to 10 28983 1726883005.59368: Set connection var ansible_pipelining to False 28983 1726883005.59370: Set connection var ansible_shell_type to sh 28983 1726883005.59394: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.59397: variable 'ansible_connection' from source: unknown 28983 1726883005.59399: variable 'ansible_module_compression' from source: unknown 28983 1726883005.59403: variable 'ansible_shell_type' from source: unknown 28983 1726883005.59406: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.59411: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.59418: variable 'ansible_pipelining' from source: unknown 28983 1726883005.59421: variable 'ansible_timeout' from source: unknown 28983 1726883005.59425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.59543: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883005.59556: variable 'omit' from source: magic vars 28983 1726883005.59561: starting attempt loop 28983 1726883005.59564: running the handler 28983 1726883005.59663: variable 'lsr_net_profile_exists' from source: set_fact 28983 1726883005.59667: Evaluated conditional 
(lsr_net_profile_exists): True 28983 1726883005.59669: handler run complete 28983 1726883005.59688: attempt loop complete, returning result 28983 1726883005.59691: _execute() done 28983 1726883005.59693: dumping result to json 28983 1726883005.59699: done dumping result, returning 28983 1726883005.59705: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [0affe814-3a2d-b16d-c0a7-0000000008ae] 28983 1726883005.59711: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008ae 28983 1726883005.59807: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008ae 28983 1726883005.59810: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883005.59862: no more pending results, returning what we have 28983 1726883005.59866: results queue empty 28983 1726883005.59867: checking for any_errors_fatal 28983 1726883005.59872: done checking for any_errors_fatal 28983 1726883005.59873: checking for max_fail_percentage 28983 1726883005.59875: done checking for max_fail_percentage 28983 1726883005.59876: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.59877: done checking to see if all hosts have failed 28983 1726883005.59878: getting the remaining hosts for this loop 28983 1726883005.59880: done getting the remaining hosts for this loop 28983 1726883005.59884: getting the next task for host managed_node2 28983 1726883005.59890: done getting next task for host managed_node2 28983 1726883005.59893: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 28983 1726883005.59897: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.59901: getting variables 28983 1726883005.59902: in VariableManager get_vars() 28983 1726883005.59930: Calling all_inventory to load vars for managed_node2 28983 1726883005.59933: Calling groups_inventory to load vars for managed_node2 28983 1726883005.59938: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.59947: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.59950: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.59953: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.61314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.62912: done with get_vars() 28983 1726883005.62939: done getting variables 28983 1726883005.62985: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.63071: variable 'profile' from source: play vars 28983 1726883005.63076: variable 'interface' from source: play vars 28983 1726883005.63120: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:43:25 -0400 (0:00:00.052) 0:00:35.629 ****** 28983 1726883005.63152: entering _queue_task() for managed_node2/assert 28983 1726883005.63356: worker is 1 (out of 1 available) 28983 1726883005.63370: exiting _queue_task() for managed_node2/assert 28983 1726883005.63385: done queuing things up, now waiting for results queue to drain 28983 1726883005.63387: waiting for pending results... 28983 1726883005.63577: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 28983 1726883005.63669: in run() - task 0affe814-3a2d-b16d-c0a7-0000000008af 28983 1726883005.63684: variable 'ansible_search_path' from source: unknown 28983 1726883005.63688: variable 'ansible_search_path' from source: unknown 28983 1726883005.63720: calling self._execute() 28983 1726883005.63799: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.63803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.63815: variable 'omit' from source: magic vars 28983 1726883005.64122: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.64132: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.64140: variable 'omit' from source: magic vars 28983 1726883005.64188: variable 'omit' from source: magic vars 28983 1726883005.64270: variable 'profile' from source: play vars 28983 1726883005.64278: variable 'interface' from source: play vars 28983 1726883005.64330: variable 'interface' from source: play vars 28983 1726883005.64348: variable 'omit' from source: magic vars 28983 1726883005.64385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
28983 1726883005.64417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883005.64436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883005.64451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.64462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.64491: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883005.64494: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.64499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.64581: Set connection var ansible_connection to ssh 28983 1726883005.64593: Set connection var ansible_shell_executable to /bin/sh 28983 1726883005.64602: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883005.64615: Set connection var ansible_timeout to 10 28983 1726883005.64620: Set connection var ansible_pipelining to False 28983 1726883005.64623: Set connection var ansible_shell_type to sh 28983 1726883005.64644: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.64647: variable 'ansible_connection' from source: unknown 28983 1726883005.64649: variable 'ansible_module_compression' from source: unknown 28983 1726883005.64652: variable 'ansible_shell_type' from source: unknown 28983 1726883005.64657: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.64660: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.64665: variable 'ansible_pipelining' from source: unknown 28983 1726883005.64668: variable 'ansible_timeout' from source: unknown 28983 1726883005.64676: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.64792: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883005.64803: variable 'omit' from source: magic vars 28983 1726883005.64808: starting attempt loop 28983 1726883005.64813: running the handler 28983 1726883005.64908: variable 'lsr_net_profile_ansible_managed' from source: set_fact 28983 1726883005.64911: Evaluated conditional (lsr_net_profile_ansible_managed): True 28983 1726883005.64919: handler run complete 28983 1726883005.64936: attempt loop complete, returning result 28983 1726883005.64940: _execute() done 28983 1726883005.64942: dumping result to json 28983 1726883005.64954: done dumping result, returning 28983 1726883005.64957: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0affe814-3a2d-b16d-c0a7-0000000008af] 28983 1726883005.64960: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008af 28983 1726883005.65055: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008af 28983 1726883005.65059: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883005.65118: no more pending results, returning what we have 28983 1726883005.65121: results queue empty 28983 1726883005.65122: checking for any_errors_fatal 28983 1726883005.65127: done checking for any_errors_fatal 28983 1726883005.65128: checking for max_fail_percentage 28983 1726883005.65130: done checking for max_fail_percentage 28983 1726883005.65131: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.65132: done checking to see if all hosts have failed 28983 1726883005.65132: 
getting the remaining hosts for this loop 28983 1726883005.65136: done getting the remaining hosts for this loop 28983 1726883005.65140: getting the next task for host managed_node2 28983 1726883005.65147: done getting next task for host managed_node2 28983 1726883005.65150: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 28983 1726883005.65154: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883005.65158: getting variables 28983 1726883005.65159: in VariableManager get_vars() 28983 1726883005.65199: Calling all_inventory to load vars for managed_node2 28983 1726883005.65202: Calling groups_inventory to load vars for managed_node2 28983 1726883005.65206: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.65214: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.65217: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.65219: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.66523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.68139: done with get_vars() 28983 1726883005.68160: done getting variables 28983 1726883005.68204: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.68289: variable 'profile' from source: play vars 28983 1726883005.68292: variable 'interface' from source: play vars 28983 1726883005.68345: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:43:25 -0400 (0:00:00.052) 0:00:35.681 ****** 28983 1726883005.68369: entering _queue_task() for managed_node2/assert 28983 1726883005.68568: worker is 1 (out of 1 available) 28983 1726883005.68585: exiting _queue_task() for managed_node2/assert 28983 1726883005.68599: done queuing things up, now waiting for results queue to drain 28983 1726883005.68601: waiting for pending results... 
28983 1726883005.68782: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr 28983 1726883005.68883: in run() - task 0affe814-3a2d-b16d-c0a7-0000000008b0 28983 1726883005.68896: variable 'ansible_search_path' from source: unknown 28983 1726883005.68900: variable 'ansible_search_path' from source: unknown 28983 1726883005.68933: calling self._execute() 28983 1726883005.69019: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.69024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.69035: variable 'omit' from source: magic vars 28983 1726883005.69349: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.69359: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.69367: variable 'omit' from source: magic vars 28983 1726883005.69409: variable 'omit' from source: magic vars 28983 1726883005.69489: variable 'profile' from source: play vars 28983 1726883005.69495: variable 'interface' from source: play vars 28983 1726883005.69551: variable 'interface' from source: play vars 28983 1726883005.69566: variable 'omit' from source: magic vars 28983 1726883005.69606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883005.69639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883005.69657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883005.69672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.69686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.69715: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 28983 1726883005.69721: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.69724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.69807: Set connection var ansible_connection to ssh 28983 1726883005.69818: Set connection var ansible_shell_executable to /bin/sh 28983 1726883005.69828: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883005.69838: Set connection var ansible_timeout to 10 28983 1726883005.69847: Set connection var ansible_pipelining to False 28983 1726883005.69850: Set connection var ansible_shell_type to sh 28983 1726883005.69868: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.69871: variable 'ansible_connection' from source: unknown 28983 1726883005.69877: variable 'ansible_module_compression' from source: unknown 28983 1726883005.69880: variable 'ansible_shell_type' from source: unknown 28983 1726883005.69884: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.69887: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.69893: variable 'ansible_pipelining' from source: unknown 28983 1726883005.69896: variable 'ansible_timeout' from source: unknown 28983 1726883005.69901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.70020: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883005.70035: variable 'omit' from source: magic vars 28983 1726883005.70040: starting attempt loop 28983 1726883005.70043: running the handler 28983 1726883005.70136: variable 'lsr_net_profile_fingerprint' from source: set_fact 28983 1726883005.70140: Evaluated 
conditional (lsr_net_profile_fingerprint): True 28983 1726883005.70148: handler run complete 28983 1726883005.70164: attempt loop complete, returning result 28983 1726883005.70167: _execute() done 28983 1726883005.70170: dumping result to json 28983 1726883005.70172: done dumping result, returning 28983 1726883005.70185: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [0affe814-3a2d-b16d-c0a7-0000000008b0] 28983 1726883005.70190: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008b0 28983 1726883005.70282: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000008b0 28983 1726883005.70287: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883005.70338: no more pending results, returning what we have 28983 1726883005.70341: results queue empty 28983 1726883005.70342: checking for any_errors_fatal 28983 1726883005.70348: done checking for any_errors_fatal 28983 1726883005.70349: checking for max_fail_percentage 28983 1726883005.70351: done checking for max_fail_percentage 28983 1726883005.70352: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.70353: done checking to see if all hosts have failed 28983 1726883005.70354: getting the remaining hosts for this loop 28983 1726883005.70355: done getting the remaining hosts for this loop 28983 1726883005.70359: getting the next task for host managed_node2 28983 1726883005.70368: done getting next task for host managed_node2 28983 1726883005.70371: ^ task is: TASK: Conditional asserts 28983 1726883005.70374: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
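The task name `Assert that the fingerprint comment is present in {{ profile }}` renders to `statebr` because `profile` is itself templated from `interface`, which is why the log resolves `variable 'profile' from source: play vars` followed by two `variable 'interface'` lookups. A sketch of that relationship, assuming (not copied from the source) that the test play defines it this way:

```yaml
# Assumed play vars; consistent with the variable-resolution order
# in the log (profile -> interface), not taken from the playbook.
interface: statebr
profile: "{{ interface }}"
```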
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.70378: getting variables 28983 1726883005.70379: in VariableManager get_vars() 28983 1726883005.70408: Calling all_inventory to load vars for managed_node2 28983 1726883005.70411: Calling groups_inventory to load vars for managed_node2 28983 1726883005.70414: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.70423: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.70426: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.70430: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.71643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.73262: done with get_vars() 28983 1726883005.73291: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:43:25 -0400 (0:00:00.049) 0:00:35.731 ****** 28983 1726883005.73359: entering _queue_task() for managed_node2/include_tasks 28983 1726883005.73556: worker is 1 (out of 1 available) 28983 1726883005.73570: exiting _queue_task() for managed_node2/include_tasks 28983 1726883005.73584: done queuing things up, now waiting for results queue to drain 28983 1726883005.73586: waiting for pending results... 
28983 1726883005.73768: running TaskExecutor() for managed_node2/TASK: Conditional asserts 28983 1726883005.73852: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005ba 28983 1726883005.73865: variable 'ansible_search_path' from source: unknown 28983 1726883005.73868: variable 'ansible_search_path' from source: unknown 28983 1726883005.74096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883005.76087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883005.76144: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883005.76175: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883005.76207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883005.76232: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883005.76304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883005.76329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883005.76356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883005.76391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 28983 1726883005.76403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883005.76523: dumping result to json 28983 1726883005.76528: done dumping result, returning 28983 1726883005.76532: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0affe814-3a2d-b16d-c0a7-0000000005ba] 28983 1726883005.76543: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005ba 28983 1726883005.76651: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005ba 28983 1726883005.76655: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 28983 1726883005.76706: no more pending results, returning what we have 28983 1726883005.76710: results queue empty 28983 1726883005.76711: checking for any_errors_fatal 28983 1726883005.76717: done checking for any_errors_fatal 28983 1726883005.76718: checking for max_fail_percentage 28983 1726883005.76720: done checking for max_fail_percentage 28983 1726883005.76721: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.76722: done checking to see if all hosts have failed 28983 1726883005.76723: getting the remaining hosts for this loop 28983 1726883005.76725: done getting the remaining hosts for this loop 28983 1726883005.76729: getting the next task for host managed_node2 28983 1726883005.76737: done getting next task for host managed_node2 28983 1726883005.76739: ^ task is: TASK: Success in test '{{ lsr_description }}' 28983 1726883005.76743: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.76746: getting variables 28983 1726883005.76748: in VariableManager get_vars() 28983 1726883005.76776: Calling all_inventory to load vars for managed_node2 28983 1726883005.76779: Calling groups_inventory to load vars for managed_node2 28983 1726883005.76782: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.76791: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.76794: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.76798: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.78153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.79801: done with get_vars() 28983 1726883005.79822: done getting variables 28983 1726883005.79867: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883005.79957: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:43:25 -0400 (0:00:00.066) 0:00:35.797 ****** 28983 1726883005.79980: entering _queue_task() for managed_node2/debug 28983 1726883005.80181: worker is 1 (out of 
1 available) 28983 1726883005.80195: exiting _queue_task() for managed_node2/debug 28983 1726883005.80206: done queuing things up, now waiting for results queue to drain 28983 1726883005.80208: waiting for pending results... 28983 1726883005.80402: running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile without autoconnect' 28983 1726883005.80493: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005bb 28983 1726883005.80506: variable 'ansible_search_path' from source: unknown 28983 1726883005.80510: variable 'ansible_search_path' from source: unknown 28983 1726883005.80544: calling self._execute() 28983 1726883005.80627: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.80633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.80647: variable 'omit' from source: magic vars 28983 1726883005.80962: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.80973: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.80983: variable 'omit' from source: magic vars 28983 1726883005.81021: variable 'omit' from source: magic vars 28983 1726883005.81130: variable 'lsr_description' from source: include params 28983 1726883005.81228: variable 'omit' from source: magic vars 28983 1726883005.81233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883005.81237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883005.81255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883005.81280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883005.81292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883005.81540: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883005.81543: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.81546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.81549: Set connection var ansible_connection to ssh 28983 1726883005.81551: Set connection var ansible_shell_executable to /bin/sh 28983 1726883005.81554: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883005.81556: Set connection var ansible_timeout to 10 28983 1726883005.81558: Set connection var ansible_pipelining to False 28983 1726883005.81560: Set connection var ansible_shell_type to sh 28983 1726883005.81563: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.81565: variable 'ansible_connection' from source: unknown 28983 1726883005.81568: variable 'ansible_module_compression' from source: unknown 28983 1726883005.81570: variable 'ansible_shell_type' from source: unknown 28983 1726883005.81572: variable 'ansible_shell_executable' from source: unknown 28983 1726883005.81574: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.81576: variable 'ansible_pipelining' from source: unknown 28983 1726883005.81579: variable 'ansible_timeout' from source: unknown 28983 1726883005.81581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.81812: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883005.81816: variable 'omit' from source: magic vars 28983 1726883005.81819: starting attempt loop 28983 1726883005.81826: running the handler 28983 
1726883005.81830: handler run complete 28983 1726883005.81832: attempt loop complete, returning result 28983 1726883005.81836: _execute() done 28983 1726883005.81839: dumping result to json 28983 1726883005.81841: done dumping result, returning 28983 1726883005.81844: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile without autoconnect' [0affe814-3a2d-b16d-c0a7-0000000005bb] 28983 1726883005.81846: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005bb 28983 1726883005.82058: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005bb 28983 1726883005.82062: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 28983 1726883005.82105: no more pending results, returning what we have 28983 1726883005.82108: results queue empty 28983 1726883005.82109: checking for any_errors_fatal 28983 1726883005.82114: done checking for any_errors_fatal 28983 1726883005.82115: checking for max_fail_percentage 28983 1726883005.82117: done checking for max_fail_percentage 28983 1726883005.82118: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.82119: done checking to see if all hosts have failed 28983 1726883005.82120: getting the remaining hosts for this loop 28983 1726883005.82122: done getting the remaining hosts for this loop 28983 1726883005.82126: getting the next task for host managed_node2 28983 1726883005.82133: done getting next task for host managed_node2 28983 1726883005.82138: ^ task is: TASK: Cleanup 28983 1726883005.82141: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883005.82150: getting variables 28983 1726883005.82152: in VariableManager get_vars() 28983 1726883005.82190: Calling all_inventory to load vars for managed_node2 28983 1726883005.82193: Calling groups_inventory to load vars for managed_node2 28983 1726883005.82197: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.82206: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.82210: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.82213: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.83964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.85577: done with get_vars() 28983 1726883005.85599: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:43:25 -0400 (0:00:00.056) 0:00:35.854 ****** 28983 1726883005.85676: entering _queue_task() for managed_node2/include_tasks 28983 1726883005.85926: worker is 1 (out of 1 available) 28983 1726883005.85941: exiting _queue_task() for managed_node2/include_tasks 28983 1726883005.85956: done queuing things up, now waiting for results queue to drain 28983 1726883005.85958: waiting for pending results... 
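The success banner printed above (`+++++ Success in test 'I can create a profile without autoconnect' +++++`) is produced by a templated `debug` task at `run_test.yml:47`, with `lsr_description` supplied as an include parameter. A minimal sketch, assuming only the message format visible in the log output:

```yaml
# Hypothetical sketch of the success task at run_test.yml:47;
# lsr_description comes from include params per the log.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
```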
28983 1726883005.86354: running TaskExecutor() for managed_node2/TASK: Cleanup 28983 1726883005.86389: in run() - task 0affe814-3a2d-b16d-c0a7-0000000005bf 28983 1726883005.86409: variable 'ansible_search_path' from source: unknown 28983 1726883005.86418: variable 'ansible_search_path' from source: unknown 28983 1726883005.86475: variable 'lsr_cleanup' from source: include params 28983 1726883005.86702: variable 'lsr_cleanup' from source: include params 28983 1726883005.86784: variable 'omit' from source: magic vars 28983 1726883005.86947: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883005.86968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883005.86987: variable 'omit' from source: magic vars 28983 1726883005.87294: variable 'ansible_distribution_major_version' from source: facts 28983 1726883005.87311: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883005.87327: variable 'item' from source: unknown 28983 1726883005.87407: variable 'item' from source: unknown 28983 1726883005.87458: variable 'item' from source: unknown 28983 1726883005.87540: variable 'item' from source: unknown 28983 1726883005.87878: dumping result to json 28983 1726883005.87882: done dumping result, returning 28983 1726883005.87885: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affe814-3a2d-b16d-c0a7-0000000005bf] 28983 1726883005.87888: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005bf 28983 1726883005.87929: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000005bf 28983 1726883005.87931: WORKER PROCESS EXITING 28983 1726883005.87958: no more pending results, returning what we have 28983 1726883005.87963: in VariableManager get_vars() 28983 1726883005.88001: Calling all_inventory to load vars for managed_node2 28983 1726883005.88004: Calling groups_inventory to load vars for managed_node2 28983 1726883005.88008: Calling 
all_plugins_inventory to load vars for managed_node2 28983 1726883005.88020: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.88024: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.88028: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.90253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883005.93110: done with get_vars() 28983 1726883005.93148: variable 'ansible_search_path' from source: unknown 28983 1726883005.93150: variable 'ansible_search_path' from source: unknown 28983 1726883005.93196: we have included files to process 28983 1726883005.93198: generating all_blocks data 28983 1726883005.93200: done generating all_blocks data 28983 1726883005.93208: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883005.93209: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883005.93212: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883005.93452: done processing included file 28983 1726883005.93455: iterating over new_blocks loaded from include file 28983 1726883005.93457: in VariableManager get_vars() 28983 1726883005.93476: done with get_vars() 28983 1726883005.93478: filtering new block on tags 28983 1726883005.93512: done filtering new block on tags 28983 1726883005.93515: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 28983 1726883005.93521: extending task lists for all hosts with included blocks 
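The `Cleanup` task above is an `include_tasks` looping over `lsr_cleanup`, which here resolves to a single item, `tasks/cleanup_profile+device.yml` (per the `included: ... => (item=tasks/cleanup_profile+device.yml)` line). A sketch of the shape such a task takes; the real task at `run_test.yml:66` may differ:

```yaml
# Hypothetical sketch; the loop list lsr_cleanup comes from include
# params, and each item is a task file to include.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
```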
28983 1726883005.95442: done extending task lists 28983 1726883005.95444: done processing included files 28983 1726883005.95445: results queue empty 28983 1726883005.95446: checking for any_errors_fatal 28983 1726883005.95451: done checking for any_errors_fatal 28983 1726883005.95452: checking for max_fail_percentage 28983 1726883005.95454: done checking for max_fail_percentage 28983 1726883005.95454: checking to see if all hosts have failed and the running result is not ok 28983 1726883005.95456: done checking to see if all hosts have failed 28983 1726883005.95456: getting the remaining hosts for this loop 28983 1726883005.95458: done getting the remaining hosts for this loop 28983 1726883005.95462: getting the next task for host managed_node2 28983 1726883005.95467: done getting next task for host managed_node2 28983 1726883005.95470: ^ task is: TASK: Cleanup profile and device 28983 1726883005.95473: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883005.95477: getting variables 28983 1726883005.95478: in VariableManager get_vars() 28983 1726883005.95490: Calling all_inventory to load vars for managed_node2 28983 1726883005.95492: Calling groups_inventory to load vars for managed_node2 28983 1726883005.95496: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883005.95502: Calling all_plugins_play to load vars for managed_node2 28983 1726883005.95506: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883005.95510: Calling groups_plugins_play to load vars for managed_node2 28983 1726883005.97623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.00494: done with get_vars() 28983 1726883006.00529: done getting variables 28983 1726883006.00585: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:43:26 -0400 (0:00:00.149) 0:00:36.003 ****** 28983 1726883006.00620: entering _queue_task() for managed_node2/shell 28983 1726883006.00995: worker is 1 (out of 1 available) 28983 1726883006.01011: exiting _queue_task() for managed_node2/shell 28983 1726883006.01025: done queuing things up, now waiting for results queue to drain 28983 1726883006.01027: waiting for pending results... 
28983 1726883006.01355: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 28983 1726883006.01515: in run() - task 0affe814-3a2d-b16d-c0a7-0000000009a0 28983 1726883006.01524: variable 'ansible_search_path' from source: unknown 28983 1726883006.01526: variable 'ansible_search_path' from source: unknown 28983 1726883006.01538: calling self._execute() 28983 1726883006.01653: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.01668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.01691: variable 'omit' from source: magic vars 28983 1726883006.02170: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.02197: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.02281: variable 'omit' from source: magic vars 28983 1726883006.02287: variable 'omit' from source: magic vars 28983 1726883006.02499: variable 'interface' from source: play vars 28983 1726883006.02525: variable 'omit' from source: magic vars 28983 1726883006.02607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883006.02639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.02669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883006.02716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.02726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.02769: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.02826: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.02833: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.02934: Set connection var ansible_connection to ssh 28983 1726883006.02959: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.02979: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.02994: Set connection var ansible_timeout to 10 28983 1726883006.03006: Set connection var ansible_pipelining to False 28983 1726883006.03041: Set connection var ansible_shell_type to sh 28983 1726883006.03058: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.03068: variable 'ansible_connection' from source: unknown 28983 1726883006.03079: variable 'ansible_module_compression' from source: unknown 28983 1726883006.03140: variable 'ansible_shell_type' from source: unknown 28983 1726883006.03143: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.03147: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.03149: variable 'ansible_pipelining' from source: unknown 28983 1726883006.03152: variable 'ansible_timeout' from source: unknown 28983 1726883006.03154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.03314: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.03332: variable 'omit' from source: magic vars 28983 1726883006.03345: starting attempt loop 28983 1726883006.03354: running the handler 28983 1726883006.03369: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.03406: _low_level_execute_command(): starting 28983 1726883006.03492: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883006.04335: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883006.04339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883006.04342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883006.04346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883006.04349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883006.04444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.04520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883006.06297: stdout chunk (state=3): >>>/root <<< 28983 1726883006.06405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 
1726883006.06484: stderr chunk (state=3): >>><<< 28983 1726883006.06496: stdout chunk (state=3): >>><<< 28983 1726883006.06530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883006.06579: _low_level_execute_command(): starting 28983 1726883006.06583: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532 `" && echo ansible-tmp-1726883006.0654273-30340-77391041528532="` echo /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532 `" ) && sleep 0' 28983 1726883006.07201: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883006.07218: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883006.07288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883006.07291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.07363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883006.09398: stdout chunk (state=3): >>>ansible-tmp-1726883006.0654273-30340-77391041528532=/root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532 <<< 28983 1726883006.09524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883006.09627: stderr chunk (state=3): >>><<< 28983 1726883006.09630: stdout chunk (state=3): >>><<< 28983 1726883006.09866: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883006.0654273-30340-77391041528532=/root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883006.09870: variable 'ansible_module_compression' from source: unknown 28983 1726883006.09875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883006.09878: variable 'ansible_facts' from source: unknown 28983 1726883006.09923: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py 28983 1726883006.10126: Sending initial data 28983 1726883006.10139: Sent initial data (155 bytes) 28983 1726883006.10730: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883006.10749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883006.10771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883006.10813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883006.10831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.10901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883006.12573: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883006.12577: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883006.12637: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883006.12710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2h_802b4 /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py <<< 28983 1726883006.12712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py" <<< 28983 1726883006.12774: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2h_802b4" to remote "/root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py" <<< 28983 1726883006.13950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883006.13953: stderr chunk (state=3): >>><<< 28983 1726883006.13956: stdout chunk (state=3): >>><<< 28983 1726883006.13959: done transferring module to remote 28983 1726883006.13961: _low_level_execute_command(): starting 28983 1726883006.13963: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/ /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py && sleep 0' 28983 1726883006.14560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883006.14563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883006.14700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726883006.14709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883006.14731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883006.14763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.14870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883006.16872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883006.16876: stdout chunk (state=3): >>><<< 28983 1726883006.16878: stderr chunk (state=3): >>><<< 28983 1726883006.16996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883006.16999: _low_level_execute_command(): starting 28983 1726883006.17002: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/AnsiballZ_command.py && sleep 0' 28983 1726883006.17594: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883006.17609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883006.17633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883006.17693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883006.17786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883006.17836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.17921: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883006.39042: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (1a065f38-1816-45bb-8e19-5c41e45c0397) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:43:26.351814", "end": "2024-09-20 21:43:26.389249", "delta": "0:00:00.037435", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883006.40941: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883006.40945: stdout chunk (state=3): >>><<< 28983 1726883006.40948: stderr chunk (state=3): >>><<< 28983 1726883006.40951: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (1a065f38-1816-45bb-8e19-5c41e45c0397) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:43:26.351814", "end": "2024-09-20 21:43:26.389249", "delta": "0:00:00.037435", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 28983 1726883006.40964: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883006.40984: _low_level_execute_command(): starting 28983 1726883006.41008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883006.0654273-30340-77391041528532/ > /dev/null 2>&1 && sleep 0' 28983 1726883006.41723: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883006.41752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.41858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883006.43810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883006.43854: stderr chunk (state=3): >>><<< 28983 1726883006.43858: stdout chunk (state=3): >>><<< 28983 1726883006.43872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883006.43885: handler run complete 28983 1726883006.43908: Evaluated conditional (False): False 28983 1726883006.43918: attempt loop complete, returning result 28983 1726883006.43921: _execute() done 28983 1726883006.43925: dumping result to json 28983 1726883006.43932: done dumping result, returning 28983 1726883006.43941: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0affe814-3a2d-b16d-c0a7-0000000009a0] 28983 1726883006.43947: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000009a0 28983 1726883006.44060: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000009a0 28983 1726883006.44063: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.037435", "end": "2024-09-20 21:43:26.389249", "rc": 1, "start": "2024-09-20 21:43:26.351814" } STDOUT: Connection 'statebr' (1a065f38-1816-45bb-8e19-5c41e45c0397) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 28983 1726883006.44140: no more pending results, returning what we have 28983 1726883006.44144: results queue empty 28983 1726883006.44145: checking for any_errors_fatal 28983 1726883006.44147: done checking for any_errors_fatal 28983 1726883006.44148: checking for max_fail_percentage 28983 1726883006.44150: done checking for max_fail_percentage 28983 1726883006.44151: checking to see if all hosts have failed and the running result is not ok 28983 1726883006.44152: done checking to see if all hosts have failed 28983 1726883006.44153: getting the remaining hosts for this loop 28983 1726883006.44155: done getting the remaining hosts for this loop 28983 1726883006.44160: getting the next task for host managed_node2 28983 1726883006.44171: done getting next task for host managed_node2 28983 1726883006.44175: ^ task is: TASK: Include the task 'run_test.yml' 28983 1726883006.44177: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883006.44181: getting variables 28983 1726883006.44183: in VariableManager get_vars() 28983 1726883006.44217: Calling all_inventory to load vars for managed_node2 28983 1726883006.44220: Calling groups_inventory to load vars for managed_node2 28983 1726883006.44223: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.44242: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.44247: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.44251: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.45657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.47265: done with get_vars() 28983 1726883006.47294: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Friday 20 September 2024 21:43:26 -0400 (0:00:00.467) 0:00:36.471 ****** 28983 1726883006.47371: entering _queue_task() for managed_node2/include_tasks 28983 1726883006.47608: worker is 1 (out of 1 available) 28983 1726883006.47623: exiting _queue_task() for managed_node2/include_tasks 28983 1726883006.47637: done queuing things up, now waiting for results queue to drain 28983 1726883006.47639: waiting for pending results... 
28983 1726883006.47825: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 28983 1726883006.47894: in run() - task 0affe814-3a2d-b16d-c0a7-000000000011 28983 1726883006.47907: variable 'ansible_search_path' from source: unknown 28983 1726883006.47944: calling self._execute() 28983 1726883006.48027: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.48036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.48047: variable 'omit' from source: magic vars 28983 1726883006.48370: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.48382: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.48388: _execute() done 28983 1726883006.48393: dumping result to json 28983 1726883006.48396: done dumping result, returning 28983 1726883006.48402: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0affe814-3a2d-b16d-c0a7-000000000011] 28983 1726883006.48408: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000011 28983 1726883006.48524: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000011 28983 1726883006.48527: WORKER PROCESS EXITING 28983 1726883006.48561: no more pending results, returning what we have 28983 1726883006.48566: in VariableManager get_vars() 28983 1726883006.48606: Calling all_inventory to load vars for managed_node2 28983 1726883006.48610: Calling groups_inventory to load vars for managed_node2 28983 1726883006.48613: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.48624: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.48627: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.48631: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.49867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28983 1726883006.51474: done with get_vars() 28983 1726883006.51493: variable 'ansible_search_path' from source: unknown 28983 1726883006.51503: we have included files to process 28983 1726883006.51504: generating all_blocks data 28983 1726883006.51506: done generating all_blocks data 28983 1726883006.51511: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883006.51512: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883006.51514: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883006.51842: in VariableManager get_vars() 28983 1726883006.51858: done with get_vars() 28983 1726883006.51894: in VariableManager get_vars() 28983 1726883006.51907: done with get_vars() 28983 1726883006.51939: in VariableManager get_vars() 28983 1726883006.51952: done with get_vars() 28983 1726883006.51986: in VariableManager get_vars() 28983 1726883006.51999: done with get_vars() 28983 1726883006.52031: in VariableManager get_vars() 28983 1726883006.52044: done with get_vars() 28983 1726883006.52440: in VariableManager get_vars() 28983 1726883006.52454: done with get_vars() 28983 1726883006.52463: done processing included file 28983 1726883006.52465: iterating over new_blocks loaded from include file 28983 1726883006.52466: in VariableManager get_vars() 28983 1726883006.52475: done with get_vars() 28983 1726883006.52476: filtering new block on tags 28983 1726883006.52563: done filtering new block on tags 28983 1726883006.52566: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 28983 1726883006.52570: extending task lists for all hosts with included 
blocks 28983 1726883006.52598: done extending task lists 28983 1726883006.52599: done processing included files 28983 1726883006.52599: results queue empty 28983 1726883006.52600: checking for any_errors_fatal 28983 1726883006.52605: done checking for any_errors_fatal 28983 1726883006.52605: checking for max_fail_percentage 28983 1726883006.52606: done checking for max_fail_percentage 28983 1726883006.52606: checking to see if all hosts have failed and the running result is not ok 28983 1726883006.52607: done checking to see if all hosts have failed 28983 1726883006.52608: getting the remaining hosts for this loop 28983 1726883006.52609: done getting the remaining hosts for this loop 28983 1726883006.52612: getting the next task for host managed_node2 28983 1726883006.52615: done getting next task for host managed_node2 28983 1726883006.52617: ^ task is: TASK: TEST: {{ lsr_description }} 28983 1726883006.52618: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883006.52620: getting variables 28983 1726883006.52621: in VariableManager get_vars() 28983 1726883006.52629: Calling all_inventory to load vars for managed_node2 28983 1726883006.52630: Calling groups_inventory to load vars for managed_node2 28983 1726883006.52632: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.52638: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.52640: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.52642: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.53718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.55302: done with get_vars() 28983 1726883006.55325: done getting variables 28983 1726883006.55360: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883006.55451: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:43:26 -0400 (0:00:00.081) 0:00:36.552 ****** 28983 1726883006.55475: entering _queue_task() for managed_node2/debug 28983 1726883006.55710: worker is 1 (out of 1 available) 28983 1726883006.55726: exiting _queue_task() for managed_node2/debug 28983 1726883006.55741: done queuing things up, now waiting for results queue to drain 28983 1726883006.55743: waiting for pending results... 
28983 1726883006.55937: running TaskExecutor() for managed_node2/TASK: TEST: I can activate an existing profile 28983 1726883006.56023: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a49 28983 1726883006.56038: variable 'ansible_search_path' from source: unknown 28983 1726883006.56042: variable 'ansible_search_path' from source: unknown 28983 1726883006.56076: calling self._execute() 28983 1726883006.56159: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.56166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.56179: variable 'omit' from source: magic vars 28983 1726883006.56492: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.56503: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.56509: variable 'omit' from source: magic vars 28983 1726883006.56549: variable 'omit' from source: magic vars 28983 1726883006.56633: variable 'lsr_description' from source: include params 28983 1726883006.56655: variable 'omit' from source: magic vars 28983 1726883006.56693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883006.56724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.56757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883006.56768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.56782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.56809: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.56812: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.56817: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.56904: Set connection var ansible_connection to ssh 28983 1726883006.56915: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.56924: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.56932: Set connection var ansible_timeout to 10 28983 1726883006.56941: Set connection var ansible_pipelining to False 28983 1726883006.56944: Set connection var ansible_shell_type to sh 28983 1726883006.56969: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.56978: variable 'ansible_connection' from source: unknown 28983 1726883006.56981: variable 'ansible_module_compression' from source: unknown 28983 1726883006.56983: variable 'ansible_shell_type' from source: unknown 28983 1726883006.56986: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.56988: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.56991: variable 'ansible_pipelining' from source: unknown 28983 1726883006.56994: variable 'ansible_timeout' from source: unknown 28983 1726883006.57000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.57117: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.57127: variable 'omit' from source: magic vars 28983 1726883006.57133: starting attempt loop 28983 1726883006.57138: running the handler 28983 1726883006.57182: handler run complete 28983 1726883006.57198: attempt loop complete, returning result 28983 1726883006.57201: _execute() done 28983 1726883006.57204: dumping result to json 28983 1726883006.57209: done dumping result, returning 28983 
1726883006.57216: done running TaskExecutor() for managed_node2/TASK: TEST: I can activate an existing profile [0affe814-3a2d-b16d-c0a7-000000000a49] 28983 1726883006.57224: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a49 28983 1726883006.57316: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a49 28983 1726883006.57319: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can activate an existing profile ########## 28983 1726883006.57379: no more pending results, returning what we have 28983 1726883006.57382: results queue empty 28983 1726883006.57384: checking for any_errors_fatal 28983 1726883006.57386: done checking for any_errors_fatal 28983 1726883006.57386: checking for max_fail_percentage 28983 1726883006.57388: done checking for max_fail_percentage 28983 1726883006.57389: checking to see if all hosts have failed and the running result is not ok 28983 1726883006.57390: done checking to see if all hosts have failed 28983 1726883006.57391: getting the remaining hosts for this loop 28983 1726883006.57393: done getting the remaining hosts for this loop 28983 1726883006.57397: getting the next task for host managed_node2 28983 1726883006.57403: done getting next task for host managed_node2 28983 1726883006.57407: ^ task is: TASK: Show item 28983 1726883006.57410: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883006.57413: getting variables 28983 1726883006.57415: in VariableManager get_vars() 28983 1726883006.57447: Calling all_inventory to load vars for managed_node2 28983 1726883006.57450: Calling groups_inventory to load vars for managed_node2 28983 1726883006.57454: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.57463: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.57466: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.57469: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.58775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.60390: done with get_vars() 28983 1726883006.60411: done getting variables 28983 1726883006.60458: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:43:26 -0400 (0:00:00.050) 0:00:36.602 ****** 28983 1726883006.60485: entering _queue_task() for managed_node2/debug 28983 1726883006.60705: worker is 1 (out of 1 available) 28983 1726883006.60719: exiting _queue_task() for managed_node2/debug 28983 1726883006.60732: done queuing things up, now waiting for results queue to drain 28983 1726883006.60736: waiting for pending results... 
28983 1726883006.60912: running TaskExecutor() for managed_node2/TASK: Show item 28983 1726883006.60991: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a4a 28983 1726883006.61003: variable 'ansible_search_path' from source: unknown 28983 1726883006.61008: variable 'ansible_search_path' from source: unknown 28983 1726883006.61052: variable 'omit' from source: magic vars 28983 1726883006.61178: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.61195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.61200: variable 'omit' from source: magic vars 28983 1726883006.61493: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.61503: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.61514: variable 'omit' from source: magic vars 28983 1726883006.61546: variable 'omit' from source: magic vars 28983 1726883006.61583: variable 'item' from source: unknown 28983 1726883006.61644: variable 'item' from source: unknown 28983 1726883006.61658: variable 'omit' from source: magic vars 28983 1726883006.61695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883006.61728: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.61748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883006.61764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.61777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.61802: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.61806: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726883006.61811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.61893: Set connection var ansible_connection to ssh 28983 1726883006.61903: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.61912: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.61920: Set connection var ansible_timeout to 10 28983 1726883006.61926: Set connection var ansible_pipelining to False 28983 1726883006.61929: Set connection var ansible_shell_type to sh 28983 1726883006.61953: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.61958: variable 'ansible_connection' from source: unknown 28983 1726883006.61960: variable 'ansible_module_compression' from source: unknown 28983 1726883006.61963: variable 'ansible_shell_type' from source: unknown 28983 1726883006.61965: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.61967: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.61970: variable 'ansible_pipelining' from source: unknown 28983 1726883006.61974: variable 'ansible_timeout' from source: unknown 28983 1726883006.61977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.62097: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.62104: variable 'omit' from source: magic vars 28983 1726883006.62110: starting attempt loop 28983 1726883006.62113: running the handler 28983 1726883006.62154: variable 'lsr_description' from source: include params 28983 1726883006.62210: variable 'lsr_description' from source: include params 28983 1726883006.62221: handler run complete 28983 1726883006.62238: attempt loop 
complete, returning result 28983 1726883006.62252: variable 'item' from source: unknown 28983 1726883006.62314: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 28983 1726883006.62466: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.62469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.62475: variable 'omit' from source: magic vars 28983 1726883006.62575: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.62578: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.62581: variable 'omit' from source: magic vars 28983 1726883006.62601: variable 'omit' from source: magic vars 28983 1726883006.62632: variable 'item' from source: unknown 28983 1726883006.62685: variable 'item' from source: unknown 28983 1726883006.62706: variable 'omit' from source: magic vars 28983 1726883006.62718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.62725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.62732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.62745: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.62748: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.62753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.62807: Set connection var ansible_connection to ssh 28983 1726883006.62823: Set connection var ansible_shell_executable to /bin/sh 
28983 1726883006.62832: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.62843: Set connection var ansible_timeout to 10 28983 1726883006.62849: Set connection var ansible_pipelining to False 28983 1726883006.62852: Set connection var ansible_shell_type to sh 28983 1726883006.62868: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.62871: variable 'ansible_connection' from source: unknown 28983 1726883006.62877: variable 'ansible_module_compression' from source: unknown 28983 1726883006.62879: variable 'ansible_shell_type' from source: unknown 28983 1726883006.62883: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.62885: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.62890: variable 'ansible_pipelining' from source: unknown 28983 1726883006.62893: variable 'ansible_timeout' from source: unknown 28983 1726883006.62899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.62976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.62983: variable 'omit' from source: magic vars 28983 1726883006.62988: starting attempt loop 28983 1726883006.62991: running the handler 28983 1726883006.63010: variable 'lsr_setup' from source: include params 28983 1726883006.63077: variable 'lsr_setup' from source: include params 28983 1726883006.63111: handler run complete 28983 1726883006.63123: attempt loop complete, returning result 28983 1726883006.63142: variable 'item' from source: unknown 28983 1726883006.63192: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml" ] } 28983 1726883006.63288: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.63291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.63304: variable 'omit' from source: magic vars 28983 1726883006.63428: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.63435: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.63440: variable 'omit' from source: magic vars 28983 1726883006.63453: variable 'omit' from source: magic vars 28983 1726883006.63487: variable 'item' from source: unknown 28983 1726883006.63543: variable 'item' from source: unknown 28983 1726883006.63555: variable 'omit' from source: magic vars 28983 1726883006.63570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.63578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.63584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.63594: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.63597: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.63602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.63662: Set connection var ansible_connection to ssh 28983 1726883006.63673: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.63681: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.63690: Set connection var ansible_timeout to 10 28983 1726883006.63696: Set connection var ansible_pipelining to False 28983 1726883006.63699: Set connection var 
ansible_shell_type to sh 28983 1726883006.63715: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.63717: variable 'ansible_connection' from source: unknown 28983 1726883006.63721: variable 'ansible_module_compression' from source: unknown 28983 1726883006.63725: variable 'ansible_shell_type' from source: unknown 28983 1726883006.63730: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.63732: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.63747: variable 'ansible_pipelining' from source: unknown 28983 1726883006.63750: variable 'ansible_timeout' from source: unknown 28983 1726883006.63752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.63816: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.63823: variable 'omit' from source: magic vars 28983 1726883006.63828: starting attempt loop 28983 1726883006.63831: running the handler 28983 1726883006.63859: variable 'lsr_test' from source: include params 28983 1726883006.63906: variable 'lsr_test' from source: include params 28983 1726883006.63920: handler run complete 28983 1726883006.63933: attempt loop complete, returning result 28983 1726883006.63948: variable 'item' from source: unknown 28983 1726883006.64002: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 28983 1726883006.64088: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.64102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.64109: variable 'omit' from source: 
magic vars 28983 1726883006.64238: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.64244: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.64248: variable 'omit' from source: magic vars 28983 1726883006.64260: variable 'omit' from source: magic vars 28983 1726883006.64294: variable 'item' from source: unknown 28983 1726883006.64348: variable 'item' from source: unknown 28983 1726883006.64361: variable 'omit' from source: magic vars 28983 1726883006.64431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.64435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.64443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.64446: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.64448: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.64451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.64463: Set connection var ansible_connection to ssh 28983 1726883006.64475: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.64482: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.64491: Set connection var ansible_timeout to 10 28983 1726883006.64497: Set connection var ansible_pipelining to False 28983 1726883006.64500: Set connection var ansible_shell_type to sh 28983 1726883006.64516: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.64519: variable 'ansible_connection' from source: unknown 28983 1726883006.64524: variable 'ansible_module_compression' from source: unknown 28983 1726883006.64528: 
variable 'ansible_shell_type' from source: unknown 28983 1726883006.64531: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.64543: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.64546: variable 'ansible_pipelining' from source: unknown 28983 1726883006.64552: variable 'ansible_timeout' from source: unknown 28983 1726883006.64555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.64619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.64626: variable 'omit' from source: magic vars 28983 1726883006.64631: starting attempt loop 28983 1726883006.64635: running the handler 28983 1726883006.64660: variable 'lsr_assert' from source: include params 28983 1726883006.64706: variable 'lsr_assert' from source: include params 28983 1726883006.64721: handler run complete 28983 1726883006.64736: attempt loop complete, returning result 28983 1726883006.64749: variable 'item' from source: unknown 28983 1726883006.64802: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 28983 1726883006.64887: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.64901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.64909: variable 'omit' from source: magic vars 28983 1726883006.65068: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.65076: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.65083: variable 'omit' 
from source: magic vars 28983 1726883006.65093: variable 'omit' from source: magic vars 28983 1726883006.65133: variable 'item' from source: unknown 28983 1726883006.65184: variable 'item' from source: unknown 28983 1726883006.65197: variable 'omit' from source: magic vars 28983 1726883006.65212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.65227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.65230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.65242: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.65245: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.65250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.65304: Set connection var ansible_connection to ssh 28983 1726883006.65313: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.65321: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.65335: Set connection var ansible_timeout to 10 28983 1726883006.65342: Set connection var ansible_pipelining to False 28983 1726883006.65345: Set connection var ansible_shell_type to sh 28983 1726883006.65361: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.65364: variable 'ansible_connection' from source: unknown 28983 1726883006.65368: variable 'ansible_module_compression' from source: unknown 28983 1726883006.65375: variable 'ansible_shell_type' from source: unknown 28983 1726883006.65378: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.65380: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883006.65385: variable 'ansible_pipelining' from source: unknown 28983 1726883006.65389: variable 'ansible_timeout' from source: unknown 28983 1726883006.65394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.65469: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.65541: variable 'omit' from source: magic vars 28983 1726883006.65544: starting attempt loop 28983 1726883006.65552: running the handler 28983 1726883006.65568: handler run complete 28983 1726883006.65581: attempt loop complete, returning result 28983 1726883006.65594: variable 'item' from source: unknown 28983 1726883006.65646: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 28983 1726883006.65735: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.65741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.65751: variable 'omit' from source: magic vars 28983 1726883006.65877: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.65881: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.65883: variable 'omit' from source: magic vars 28983 1726883006.65897: variable 'omit' from source: magic vars 28983 1726883006.65929: variable 'item' from source: unknown 28983 1726883006.65984: variable 'item' from source: unknown 28983 1726883006.65996: variable 'omit' from source: magic vars 28983 1726883006.66015: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.66022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.66028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.66041: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.66044: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.66050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.66111: Set connection var ansible_connection to ssh 28983 1726883006.66120: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.66129: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.66139: Set connection var ansible_timeout to 10 28983 1726883006.66145: Set connection var ansible_pipelining to False 28983 1726883006.66148: Set connection var ansible_shell_type to sh 28983 1726883006.66165: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.66179: variable 'ansible_connection' from source: unknown 28983 1726883006.66182: variable 'ansible_module_compression' from source: unknown 28983 1726883006.66185: variable 'ansible_shell_type' from source: unknown 28983 1726883006.66187: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.66189: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.66192: variable 'ansible_pipelining' from source: unknown 28983 1726883006.66197: variable 'ansible_timeout' from source: unknown 28983 1726883006.66203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.66280: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.66289: variable 'omit' from source: magic vars 28983 1726883006.66292: starting attempt loop 28983 1726883006.66297: running the handler 28983 1726883006.66313: variable 'lsr_fail_debug' from source: play vars 28983 1726883006.66367: variable 'lsr_fail_debug' from source: play vars 28983 1726883006.66385: handler run complete 28983 1726883006.66435: attempt loop complete, returning result 28983 1726883006.66440: variable 'item' from source: unknown 28983 1726883006.66461: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 28983 1726883006.66544: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.66557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.66566: variable 'omit' from source: magic vars 28983 1726883006.66689: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.66694: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.66699: variable 'omit' from source: magic vars 28983 1726883006.66712: variable 'omit' from source: magic vars 28983 1726883006.66745: variable 'item' from source: unknown 28983 1726883006.66799: variable 'item' from source: unknown 28983 1726883006.66811: variable 'omit' from source: magic vars 28983 1726883006.66827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.66835: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.66843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.66855: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.66858: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.66863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.66922: Set connection var ansible_connection to ssh 28983 1726883006.66931: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.66941: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.66950: Set connection var ansible_timeout to 10 28983 1726883006.66956: Set connection var ansible_pipelining to False 28983 1726883006.66958: Set connection var ansible_shell_type to sh 28983 1726883006.66977: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.66980: variable 'ansible_connection' from source: unknown 28983 1726883006.66987: variable 'ansible_module_compression' from source: unknown 28983 1726883006.66990: variable 'ansible_shell_type' from source: unknown 28983 1726883006.66992: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.66996: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.66998: variable 'ansible_pipelining' from source: unknown 28983 1726883006.67001: variable 'ansible_timeout' from source: unknown 28983 1726883006.67003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.67076: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.67079: variable 'omit' from source: magic vars 28983 1726883006.67086: starting attempt loop 28983 1726883006.67088: running the handler 28983 1726883006.67106: variable 'lsr_cleanup' from source: include params 28983 1726883006.67161: variable 'lsr_cleanup' from source: include params 28983 1726883006.67177: handler run complete 28983 1726883006.67188: attempt loop complete, returning result 28983 1726883006.67202: variable 'item' from source: unknown 28983 1726883006.67255: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 28983 1726883006.67337: dumping result to json 28983 1726883006.67342: done dumping result, returning 28983 1726883006.67345: done running TaskExecutor() for managed_node2/TASK: Show item [0affe814-3a2d-b16d-c0a7-000000000a4a] 28983 1726883006.67347: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4a 28983 1726883006.67397: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4a 28983 1726883006.67400: WORKER PROCESS EXITING 28983 1726883006.67469: no more pending results, returning what we have 28983 1726883006.67476: results queue empty 28983 1726883006.67477: checking for any_errors_fatal 28983 1726883006.67483: done checking for any_errors_fatal 28983 1726883006.67484: checking for max_fail_percentage 28983 1726883006.67486: done checking for max_fail_percentage 28983 1726883006.67487: checking to see if all hosts have failed and the running result is not ok 28983 1726883006.67488: done checking to see if all hosts have failed 28983 1726883006.67489: getting the remaining hosts for this loop 28983 1726883006.67491: done getting the remaining hosts for this loop 28983 
1726883006.67495: getting the next task for host managed_node2 28983 1726883006.67501: done getting next task for host managed_node2 28983 1726883006.67505: ^ task is: TASK: Include the task 'show_interfaces.yml' 28983 1726883006.67508: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883006.67513: getting variables 28983 1726883006.67515: in VariableManager get_vars() 28983 1726883006.67548: Calling all_inventory to load vars for managed_node2 28983 1726883006.67551: Calling groups_inventory to load vars for managed_node2 28983 1726883006.67555: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.67565: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.67568: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.67574: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.68829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.70535: done with get_vars() 28983 1726883006.70558: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:43:26 -0400 (0:00:00.101) 0:00:36.704 ****** 28983 1726883006.70643: entering _queue_task() for managed_node2/include_tasks 28983 
1726883006.70907: worker is 1 (out of 1 available) 28983 1726883006.70923: exiting _queue_task() for managed_node2/include_tasks 28983 1726883006.70939: done queuing things up, now waiting for results queue to drain 28983 1726883006.70941: waiting for pending results... 28983 1726883006.71129: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28983 1726883006.71210: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a4b 28983 1726883006.71223: variable 'ansible_search_path' from source: unknown 28983 1726883006.71227: variable 'ansible_search_path' from source: unknown 28983 1726883006.71260: calling self._execute() 28983 1726883006.71347: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.71353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.71364: variable 'omit' from source: magic vars 28983 1726883006.71685: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.71695: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.71702: _execute() done 28983 1726883006.71705: dumping result to json 28983 1726883006.71716: done dumping result, returning 28983 1726883006.71720: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-b16d-c0a7-000000000a4b] 28983 1726883006.71723: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4b 28983 1726883006.71822: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4b 28983 1726883006.71825: WORKER PROCESS EXITING 28983 1726883006.71858: no more pending results, returning what we have 28983 1726883006.71863: in VariableManager get_vars() 28983 1726883006.71906: Calling all_inventory to load vars for managed_node2 28983 1726883006.71910: Calling groups_inventory to load vars for managed_node2 28983 1726883006.71914: Calling all_plugins_inventory to load vars for managed_node2 
28983 1726883006.71924: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.71927: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.71931: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.73478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.76246: done with get_vars() 28983 1726883006.76287: variable 'ansible_search_path' from source: unknown 28983 1726883006.76288: variable 'ansible_search_path' from source: unknown 28983 1726883006.76344: we have included files to process 28983 1726883006.76346: generating all_blocks data 28983 1726883006.76348: done generating all_blocks data 28983 1726883006.76354: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883006.76356: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883006.76359: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883006.76492: in VariableManager get_vars() 28983 1726883006.76518: done with get_vars() 28983 1726883006.76658: done processing included file 28983 1726883006.76660: iterating over new_blocks loaded from include file 28983 1726883006.76662: in VariableManager get_vars() 28983 1726883006.76679: done with get_vars() 28983 1726883006.76682: filtering new block on tags 28983 1726883006.76725: done filtering new block on tags 28983 1726883006.76729: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28983 1726883006.76737: extending task lists for all hosts with included blocks 28983 1726883006.77216: 
done extending task lists 28983 1726883006.77217: done processing included files 28983 1726883006.77218: results queue empty 28983 1726883006.77219: checking for any_errors_fatal 28983 1726883006.77225: done checking for any_errors_fatal 28983 1726883006.77225: checking for max_fail_percentage 28983 1726883006.77226: done checking for max_fail_percentage 28983 1726883006.77227: checking to see if all hosts have failed and the running result is not ok 28983 1726883006.77227: done checking to see if all hosts have failed 28983 1726883006.77228: getting the remaining hosts for this loop 28983 1726883006.77229: done getting the remaining hosts for this loop 28983 1726883006.77231: getting the next task for host managed_node2 28983 1726883006.77237: done getting next task for host managed_node2 28983 1726883006.77239: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28983 1726883006.77242: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883006.77244: getting variables 28983 1726883006.77245: in VariableManager get_vars() 28983 1726883006.77253: Calling all_inventory to load vars for managed_node2 28983 1726883006.77256: Calling groups_inventory to load vars for managed_node2 28983 1726883006.77258: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.77264: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.77267: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.77271: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.78445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.80372: done with get_vars() 28983 1726883006.80405: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:43:26 -0400 (0:00:00.098) 0:00:36.802 ****** 28983 1726883006.80496: entering _queue_task() for managed_node2/include_tasks 28983 1726883006.80875: worker is 1 (out of 1 available) 28983 1726883006.80890: exiting _queue_task() for managed_node2/include_tasks 28983 1726883006.80905: done queuing things up, now waiting for results queue to drain 28983 1726883006.80907: waiting for pending results... 
28983 1726883006.81167: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28983 1726883006.81277: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a72 28983 1726883006.81293: variable 'ansible_search_path' from source: unknown 28983 1726883006.81297: variable 'ansible_search_path' from source: unknown 28983 1726883006.81335: calling self._execute() 28983 1726883006.81418: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.81424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.81436: variable 'omit' from source: magic vars 28983 1726883006.81776: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.81789: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.81792: _execute() done 28983 1726883006.81795: dumping result to json 28983 1726883006.81799: done dumping result, returning 28983 1726883006.81808: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-b16d-c0a7-000000000a72] 28983 1726883006.81812: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a72 28983 1726883006.81905: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a72 28983 1726883006.81910: WORKER PROCESS EXITING 28983 1726883006.81950: no more pending results, returning what we have 28983 1726883006.81955: in VariableManager get_vars() 28983 1726883006.81999: Calling all_inventory to load vars for managed_node2 28983 1726883006.82003: Calling groups_inventory to load vars for managed_node2 28983 1726883006.82007: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.82028: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.82035: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.82039: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883006.88469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.91269: done with get_vars() 28983 1726883006.91326: variable 'ansible_search_path' from source: unknown 28983 1726883006.91328: variable 'ansible_search_path' from source: unknown 28983 1726883006.91380: we have included files to process 28983 1726883006.91381: generating all_blocks data 28983 1726883006.91383: done generating all_blocks data 28983 1726883006.91385: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883006.91386: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883006.91389: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883006.91707: done processing included file 28983 1726883006.91710: iterating over new_blocks loaded from include file 28983 1726883006.91712: in VariableManager get_vars() 28983 1726883006.91736: done with get_vars() 28983 1726883006.91739: filtering new block on tags 28983 1726883006.91791: done filtering new block on tags 28983 1726883006.91794: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28983 1726883006.91799: extending task lists for all hosts with included blocks 28983 1726883006.92028: done extending task lists 28983 1726883006.92030: done processing included files 28983 1726883006.92031: results queue empty 28983 1726883006.92032: checking for any_errors_fatal 28983 1726883006.92037: done checking for any_errors_fatal 28983 1726883006.92038: checking for max_fail_percentage 28983 1726883006.92039: done 
checking for max_fail_percentage 28983 1726883006.92040: checking to see if all hosts have failed and the running result is not ok 28983 1726883006.92041: done checking to see if all hosts have failed 28983 1726883006.92042: getting the remaining hosts for this loop 28983 1726883006.92043: done getting the remaining hosts for this loop 28983 1726883006.92046: getting the next task for host managed_node2 28983 1726883006.92052: done getting next task for host managed_node2 28983 1726883006.92054: ^ task is: TASK: Gather current interface info 28983 1726883006.92058: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883006.92060: getting variables 28983 1726883006.92061: in VariableManager get_vars() 28983 1726883006.92075: Calling all_inventory to load vars for managed_node2 28983 1726883006.92077: Calling groups_inventory to load vars for managed_node2 28983 1726883006.92081: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883006.92087: Calling all_plugins_play to load vars for managed_node2 28983 1726883006.92090: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883006.92094: Calling groups_plugins_play to load vars for managed_node2 28983 1726883006.94049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883006.96435: done with get_vars() 28983 1726883006.96463: done getting variables 28983 1726883006.96506: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:43:26 -0400 (0:00:00.160) 0:00:36.963 ****** 28983 1726883006.96532: entering _queue_task() for managed_node2/command 28983 1726883006.96816: worker is 1 (out of 1 available) 28983 1726883006.96831: exiting _queue_task() for managed_node2/command 28983 1726883006.96848: done queuing things up, now waiting for results queue to drain 28983 1726883006.96850: waiting for pending results... 
28983 1726883006.97085: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28983 1726883006.97183: in run() - task 0affe814-3a2d-b16d-c0a7-000000000aad 28983 1726883006.97197: variable 'ansible_search_path' from source: unknown 28983 1726883006.97201: variable 'ansible_search_path' from source: unknown 28983 1726883006.97241: calling self._execute() 28983 1726883006.97440: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.97445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.97449: variable 'omit' from source: magic vars 28983 1726883006.97750: variable 'ansible_distribution_major_version' from source: facts 28983 1726883006.97760: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883006.97772: variable 'omit' from source: magic vars 28983 1726883006.97818: variable 'omit' from source: magic vars 28983 1726883006.97847: variable 'omit' from source: magic vars 28983 1726883006.97891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883006.97922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883006.97941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883006.97958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.97968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883006.98001: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883006.98005: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.98011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883006.98102: Set connection var ansible_connection to ssh 28983 1726883006.98110: Set connection var ansible_shell_executable to /bin/sh 28983 1726883006.98119: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883006.98128: Set connection var ansible_timeout to 10 28983 1726883006.98136: Set connection var ansible_pipelining to False 28983 1726883006.98138: Set connection var ansible_shell_type to sh 28983 1726883006.98159: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.98162: variable 'ansible_connection' from source: unknown 28983 1726883006.98165: variable 'ansible_module_compression' from source: unknown 28983 1726883006.98170: variable 'ansible_shell_type' from source: unknown 28983 1726883006.98172: variable 'ansible_shell_executable' from source: unknown 28983 1726883006.98179: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883006.98239: variable 'ansible_pipelining' from source: unknown 28983 1726883006.98243: variable 'ansible_timeout' from source: unknown 28983 1726883006.98246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883006.98361: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883006.98383: variable 'omit' from source: magic vars 28983 1726883006.98393: starting attempt loop 28983 1726883006.98400: running the handler 28983 1726883006.98418: _low_level_execute_command(): starting 28983 1726883006.98430: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883006.99063: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28983 1726883006.99084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883006.99102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883006.99163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883006.99170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883006.99173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883006.99250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883007.01020: stdout chunk (state=3): >>>/root <<< 28983 1726883007.01129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883007.01193: stderr chunk (state=3): >>><<< 28983 1726883007.01195: stdout chunk (state=3): >>><<< 28983 1726883007.01211: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883007.01241: _low_level_execute_command(): starting 28983 1726883007.01246: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442 `" && echo ansible-tmp-1726883007.012168-30382-103809070331442="` echo /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442 `" ) && sleep 0' 28983 1726883007.01682: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883007.01686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.01688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883007.01697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.01746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883007.01753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883007.01825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883007.03843: stdout chunk (state=3): >>>ansible-tmp-1726883007.012168-30382-103809070331442=/root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442 <<< 28983 1726883007.03965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883007.04012: stderr chunk (state=3): >>><<< 28983 1726883007.04016: stdout chunk (state=3): >>><<< 28983 1726883007.04029: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883007.012168-30382-103809070331442=/root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883007.04056: variable 'ansible_module_compression' from source: unknown 28983 1726883007.04106: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883007.04139: variable 'ansible_facts' from source: unknown 28983 1726883007.04208: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py 28983 1726883007.04313: Sending initial data 28983 1726883007.04317: Sent initial data (155 bytes) 28983 1726883007.04870: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883007.04896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883007.04995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883007.06650: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883007.06662: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883007.06719: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883007.06798: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpwkkicmuv /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py <<< 28983 1726883007.06803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py" <<< 28983 1726883007.06859: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpwkkicmuv" to remote "/root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py" <<< 28983 1726883007.07763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883007.07827: stderr chunk (state=3): >>><<< 28983 1726883007.07830: stdout chunk (state=3): >>><<< 28983 1726883007.07850: done transferring module to remote 28983 1726883007.07860: _low_level_execute_command(): starting 28983 1726883007.07865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/ /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py && sleep 0' 28983 1726883007.08317: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883007.08323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883007.08326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883007.08328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883007.08332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.08382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883007.08386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883007.08464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883007.10402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883007.10444: stderr chunk (state=3): >>><<< 28983 1726883007.10448: stdout chunk (state=3): >>><<< 28983 1726883007.10462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883007.10465: _low_level_execute_command(): starting 28983 1726883007.10471: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/AnsiballZ_command.py && sleep 0' 28983 1726883007.10905: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883007.10908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.10911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883007.10913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.10974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883007.10977: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726883007.11053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883007.28675: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:43:27.281890", "end": "2024-09-20 21:43:27.285554", "delta": "0:00:00.003664", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883007.30296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883007.30353: stderr chunk (state=3): >>><<< 28983 1726883007.30356: stdout chunk (state=3): >>><<< 28983 1726883007.30377: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:43:27.281890", "end": "2024-09-20 21:43:27.285554", "delta": "0:00:00.003664", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
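The stdout chunk above carries the module's result as a single JSON object. A minimal Python sketch (payload fields copied from the log line above; this is an illustration of the result shape, not Ansible's own parsing code) shows how the `command` module's return document can be unpacked:

```python
import json

# Result payload as emitted by AnsiballZ_command.py, copied from the
# stdout chunk in the log above (invocation details omitted for brevity).
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"], '
       '"start": "2024-09-20 21:43:27.281890", '
       '"end": "2024-09-20 21:43:27.285554", '
       '"delta": "0:00:00.003664", "msg": ""}')

result = json.loads(raw)

# The interface names are newline-separated in "stdout", exactly as
# `ls -1` printed them inside /sys/class/net on the managed node.
interfaces = result["stdout"].splitlines()
print(result["rc"], interfaces)
```

Splitting `stdout` on newlines here mirrors the `stdout_lines` field Ansible derives from the same payload.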
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883007.30414: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883007.30424: _low_level_execute_command(): starting 28983 1726883007.30430: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883007.012168-30382-103809070331442/ > /dev/null 2>&1 && sleep 0' 28983 1726883007.30901: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883007.30904: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883007.30907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.30909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883007.30911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883007.30967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883007.30975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883007.31047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883007.32971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883007.33020: stderr chunk (state=3): >>><<< 28983 1726883007.33023: stdout chunk (state=3): >>><<< 28983 1726883007.33038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883007.33046: handler run complete 28983 1726883007.33069: Evaluated conditional (False): False 28983 1726883007.33081: attempt loop complete, returning result 28983 1726883007.33084: _execute() done 28983 1726883007.33094: dumping result to json 28983 1726883007.33096: done dumping result, returning 28983 1726883007.33105: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affe814-3a2d-b16d-c0a7-000000000aad] 28983 1726883007.33110: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000aad 28983 1726883007.33219: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000aad 28983 1726883007.33222: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003664", "end": "2024-09-20 21:43:27.285554", "rc": 0, "start": "2024-09-20 21:43:27.281890" } STDOUT: bonding_masters eth0 lo 28983 1726883007.33320: no more pending results, returning what we have 28983 1726883007.33324: results queue empty 28983 1726883007.33325: checking for any_errors_fatal 28983 1726883007.33327: done checking for any_errors_fatal 28983 1726883007.33328: checking for max_fail_percentage 28983 1726883007.33330: done 
checking for max_fail_percentage 28983 1726883007.33333: checking to see if all hosts have failed and the running result is not ok 28983 1726883007.33334: done checking to see if all hosts have failed 28983 1726883007.33334: getting the remaining hosts for this loop 28983 1726883007.33337: done getting the remaining hosts for this loop 28983 1726883007.33343: getting the next task for host managed_node2 28983 1726883007.33352: done getting next task for host managed_node2 28983 1726883007.33356: ^ task is: TASK: Set current_interfaces 28983 1726883007.33363: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883007.33367: getting variables 28983 1726883007.33368: in VariableManager get_vars() 28983 1726883007.33403: Calling all_inventory to load vars for managed_node2 28983 1726883007.33407: Calling groups_inventory to load vars for managed_node2 28983 1726883007.33410: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.33420: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.33422: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.33425: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.34780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.36386: done with get_vars() 28983 1726883007.36408: done getting variables 28983 1726883007.36458: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:43:27 -0400 (0:00:00.399) 0:00:37.362 ****** 28983 1726883007.36490: entering _queue_task() for managed_node2/set_fact 28983 1726883007.36724: worker is 1 (out of 1 available) 28983 1726883007.36738: exiting _queue_task() for managed_node2/set_fact 28983 1726883007.36753: done queuing things up, now waiting for results queue to drain 28983 1726883007.36754: waiting for pending results... 
28983 1726883007.36955: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28983 1726883007.37060: in run() - task 0affe814-3a2d-b16d-c0a7-000000000aae 28983 1726883007.37077: variable 'ansible_search_path' from source: unknown 28983 1726883007.37080: variable 'ansible_search_path' from source: unknown 28983 1726883007.37114: calling self._execute() 28983 1726883007.37195: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.37199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.37214: variable 'omit' from source: magic vars 28983 1726883007.37540: variable 'ansible_distribution_major_version' from source: facts 28983 1726883007.37551: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883007.37557: variable 'omit' from source: magic vars 28983 1726883007.37600: variable 'omit' from source: magic vars 28983 1726883007.37689: variable '_current_interfaces' from source: set_fact 28983 1726883007.37741: variable 'omit' from source: magic vars 28983 1726883007.37780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883007.37811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883007.37829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883007.37847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883007.37856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883007.37889: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883007.37893: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.37896: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.37978: Set connection var ansible_connection to ssh 28983 1726883007.37988: Set connection var ansible_shell_executable to /bin/sh 28983 1726883007.38040: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883007.38044: Set connection var ansible_timeout to 10 28983 1726883007.38046: Set connection var ansible_pipelining to False 28983 1726883007.38049: Set connection var ansible_shell_type to sh 28983 1726883007.38051: variable 'ansible_shell_executable' from source: unknown 28983 1726883007.38053: variable 'ansible_connection' from source: unknown 28983 1726883007.38055: variable 'ansible_module_compression' from source: unknown 28983 1726883007.38058: variable 'ansible_shell_type' from source: unknown 28983 1726883007.38061: variable 'ansible_shell_executable' from source: unknown 28983 1726883007.38063: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.38065: variable 'ansible_pipelining' from source: unknown 28983 1726883007.38067: variable 'ansible_timeout' from source: unknown 28983 1726883007.38069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.38205: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883007.38227: variable 'omit' from source: magic vars 28983 1726883007.38243: starting attempt loop 28983 1726883007.38252: running the handler 28983 1726883007.38272: handler run complete 28983 1726883007.38275: attempt loop complete, returning result 28983 1726883007.38296: _execute() done 28983 1726883007.38299: dumping result to json 28983 1726883007.38301: done dumping result, returning 28983 
1726883007.38311: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affe814-3a2d-b16d-c0a7-000000000aae] 28983 1726883007.38317: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000aae 28983 1726883007.38419: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000aae 28983 1726883007.38422: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28983 1726883007.38542: no more pending results, returning what we have 28983 1726883007.38545: results queue empty 28983 1726883007.38546: checking for any_errors_fatal 28983 1726883007.38552: done checking for any_errors_fatal 28983 1726883007.38553: checking for max_fail_percentage 28983 1726883007.38555: done checking for max_fail_percentage 28983 1726883007.38557: checking to see if all hosts have failed and the running result is not ok 28983 1726883007.38558: done checking to see if all hosts have failed 28983 1726883007.38559: getting the remaining hosts for this loop 28983 1726883007.38561: done getting the remaining hosts for this loop 28983 1726883007.38565: getting the next task for host managed_node2 28983 1726883007.38573: done getting next task for host managed_node2 28983 1726883007.38576: ^ task is: TASK: Show current_interfaces 28983 1726883007.38581: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883007.38584: getting variables 28983 1726883007.38586: in VariableManager get_vars() 28983 1726883007.38614: Calling all_inventory to load vars for managed_node2 28983 1726883007.38617: Calling groups_inventory to load vars for managed_node2 28983 1726883007.38620: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.38630: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.38632: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.38637: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.39928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.41539: done with get_vars() 28983 1726883007.41561: done getting variables 28983 1726883007.41608: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:43:27 -0400 (0:00:00.051) 0:00:37.414 ****** 28983 1726883007.41632: entering _queue_task() for managed_node2/debug 28983 1726883007.41850: worker is 1 (out of 1 available) 28983 1726883007.41862: exiting _queue_task() for managed_node2/debug 28983 1726883007.41877: done queuing things up, now waiting for results queue to drain 28983 1726883007.41879: waiting for pending results... 
28983 1726883007.42076: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28983 1726883007.42161: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a73 28983 1726883007.42177: variable 'ansible_search_path' from source: unknown 28983 1726883007.42180: variable 'ansible_search_path' from source: unknown 28983 1726883007.42213: calling self._execute() 28983 1726883007.42293: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.42297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.42309: variable 'omit' from source: magic vars 28983 1726883007.42633: variable 'ansible_distribution_major_version' from source: facts 28983 1726883007.42645: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883007.42652: variable 'omit' from source: magic vars 28983 1726883007.42696: variable 'omit' from source: magic vars 28983 1726883007.42778: variable 'current_interfaces' from source: set_fact 28983 1726883007.42799: variable 'omit' from source: magic vars 28983 1726883007.42835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883007.42867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883007.42887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883007.42905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883007.42915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883007.42943: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883007.42947: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.42952: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.43036: Set connection var ansible_connection to ssh 28983 1726883007.43046: Set connection var ansible_shell_executable to /bin/sh 28983 1726883007.43055: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883007.43064: Set connection var ansible_timeout to 10 28983 1726883007.43070: Set connection var ansible_pipelining to False 28983 1726883007.43075: Set connection var ansible_shell_type to sh 28983 1726883007.43094: variable 'ansible_shell_executable' from source: unknown 28983 1726883007.43097: variable 'ansible_connection' from source: unknown 28983 1726883007.43100: variable 'ansible_module_compression' from source: unknown 28983 1726883007.43102: variable 'ansible_shell_type' from source: unknown 28983 1726883007.43114: variable 'ansible_shell_executable' from source: unknown 28983 1726883007.43117: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.43119: variable 'ansible_pipelining' from source: unknown 28983 1726883007.43121: variable 'ansible_timeout' from source: unknown 28983 1726883007.43123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.43239: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883007.43250: variable 'omit' from source: magic vars 28983 1726883007.43256: starting attempt loop 28983 1726883007.43258: running the handler 28983 1726883007.43303: handler run complete 28983 1726883007.43315: attempt loop complete, returning result 28983 1726883007.43318: _execute() done 28983 1726883007.43327: dumping result to json 28983 1726883007.43332: done dumping result, returning 28983 1726883007.43335: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affe814-3a2d-b16d-c0a7-000000000a73] 28983 1726883007.43343: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a73 28983 1726883007.43427: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a73 28983 1726883007.43432: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28983 1726883007.43485: no more pending results, returning what we have 28983 1726883007.43489: results queue empty 28983 1726883007.43490: checking for any_errors_fatal 28983 1726883007.43497: done checking for any_errors_fatal 28983 1726883007.43498: checking for max_fail_percentage 28983 1726883007.43499: done checking for max_fail_percentage 28983 1726883007.43500: checking to see if all hosts have failed and the running result is not ok 28983 1726883007.43501: done checking to see if all hosts have failed 28983 1726883007.43502: getting the remaining hosts for this loop 28983 1726883007.43504: done getting the remaining hosts for this loop 28983 1726883007.43508: getting the next task for host managed_node2 28983 1726883007.43520: done getting next task for host managed_node2 28983 1726883007.43525: ^ task is: TASK: Setup 28983 1726883007.43528: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883007.43533: getting variables 28983 1726883007.43624: in VariableManager get_vars() 28983 1726883007.43657: Calling all_inventory to load vars for managed_node2 28983 1726883007.43673: Calling groups_inventory to load vars for managed_node2 28983 1726883007.43677: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.43688: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.43691: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.43695: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.45365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.47112: done with get_vars() 28983 1726883007.47146: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:43:27 -0400 (0:00:00.056) 0:00:37.470 ****** 28983 1726883007.47248: entering _queue_task() for managed_node2/include_tasks 28983 1726883007.47506: worker is 1 (out of 1 available) 28983 1726883007.47519: exiting _queue_task() for managed_node2/include_tasks 28983 1726883007.47532: done queuing things up, now waiting for results queue to drain 28983 1726883007.47535: waiting for pending results... 
28983 1726883007.47952: running TaskExecutor() for managed_node2/TASK: Setup 28983 1726883007.47977: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a4c 28983 1726883007.47997: variable 'ansible_search_path' from source: unknown 28983 1726883007.48004: variable 'ansible_search_path' from source: unknown 28983 1726883007.48055: variable 'lsr_setup' from source: include params 28983 1726883007.48292: variable 'lsr_setup' from source: include params 28983 1726883007.48366: variable 'omit' from source: magic vars 28983 1726883007.48530: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.48551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.48567: variable 'omit' from source: magic vars 28983 1726883007.48878: variable 'ansible_distribution_major_version' from source: facts 28983 1726883007.48938: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883007.48941: variable 'item' from source: unknown 28983 1726883007.48996: variable 'item' from source: unknown 28983 1726883007.49042: variable 'item' from source: unknown 28983 1726883007.49126: variable 'item' from source: unknown 28983 1726883007.49443: dumping result to json 28983 1726883007.49447: done dumping result, returning 28983 1726883007.49450: done running TaskExecutor() for managed_node2/TASK: Setup [0affe814-3a2d-b16d-c0a7-000000000a4c] 28983 1726883007.49454: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4c 28983 1726883007.49501: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4c 28983 1726883007.49504: WORKER PROCESS EXITING 28983 1726883007.49575: no more pending results, returning what we have 28983 1726883007.49580: in VariableManager get_vars() 28983 1726883007.49616: Calling all_inventory to load vars for managed_node2 28983 1726883007.49619: Calling groups_inventory to load vars for managed_node2 28983 1726883007.49623: Calling all_plugins_inventory to 
load vars for managed_node2 28983 1726883007.49637: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.49640: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.49645: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.52058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.55015: done with get_vars() 28983 1726883007.55052: variable 'ansible_search_path' from source: unknown 28983 1726883007.55054: variable 'ansible_search_path' from source: unknown 28983 1726883007.55104: we have included files to process 28983 1726883007.55106: generating all_blocks data 28983 1726883007.55108: done generating all_blocks data 28983 1726883007.55114: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883007.55116: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883007.55119: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883007.55431: done processing included file 28983 1726883007.55435: iterating over new_blocks loaded from include file 28983 1726883007.55437: in VariableManager get_vars() 28983 1726883007.55456: done with get_vars() 28983 1726883007.55459: filtering new block on tags 28983 1726883007.55508: done filtering new block on tags 28983 1726883007.55511: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 28983 1726883007.55517: extending task lists for all hosts with included blocks 28983 1726883007.56398: done 
extending task lists 28983 1726883007.56400: done processing included files 28983 1726883007.56401: results queue empty 28983 1726883007.56402: checking for any_errors_fatal 28983 1726883007.56406: done checking for any_errors_fatal 28983 1726883007.56407: checking for max_fail_percentage 28983 1726883007.56408: done checking for max_fail_percentage 28983 1726883007.56409: checking to see if all hosts have failed and the running result is not ok 28983 1726883007.56410: done checking to see if all hosts have failed 28983 1726883007.56411: getting the remaining hosts for this loop 28983 1726883007.56413: done getting the remaining hosts for this loop 28983 1726883007.56416: getting the next task for host managed_node2 28983 1726883007.56422: done getting next task for host managed_node2 28983 1726883007.56424: ^ task is: TASK: Include network role 28983 1726883007.56427: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883007.56430: getting variables 28983 1726883007.56431: in VariableManager get_vars() 28983 1726883007.56445: Calling all_inventory to load vars for managed_node2 28983 1726883007.56448: Calling groups_inventory to load vars for managed_node2 28983 1726883007.56451: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.56457: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.56460: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.56464: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.58574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.61482: done with get_vars() 28983 1726883007.61515: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:43:27 -0400 (0:00:00.143) 0:00:37.613 ****** 28983 1726883007.61608: entering _queue_task() for managed_node2/include_role 28983 1726883007.61982: worker is 1 (out of 1 available) 28983 1726883007.61998: exiting _queue_task() for managed_node2/include_role 28983 1726883007.62013: done queuing things up, now waiting for results queue to drain 28983 1726883007.62015: waiting for pending results... 
28983 1726883007.62355: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883007.62533: in run() - task 0affe814-3a2d-b16d-c0a7-000000000ad1 28983 1726883007.62539: variable 'ansible_search_path' from source: unknown 28983 1726883007.62542: variable 'ansible_search_path' from source: unknown 28983 1726883007.62544: calling self._execute() 28983 1726883007.62644: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.62659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.62677: variable 'omit' from source: magic vars 28983 1726883007.63122: variable 'ansible_distribution_major_version' from source: facts 28983 1726883007.63143: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883007.63155: _execute() done 28983 1726883007.63186: dumping result to json 28983 1726883007.63189: done dumping result, returning 28983 1726883007.63192: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000000ad1] 28983 1726883007.63195: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000ad1 28983 1726883007.63444: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000ad1 28983 1726883007.63448: WORKER PROCESS EXITING 28983 1726883007.63484: no more pending results, returning what we have 28983 1726883007.63490: in VariableManager get_vars() 28983 1726883007.63528: Calling all_inventory to load vars for managed_node2 28983 1726883007.63532: Calling groups_inventory to load vars for managed_node2 28983 1726883007.63537: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.63548: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.63551: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.63554: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.65765: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.68913: done with get_vars() 28983 1726883007.68948: variable 'ansible_search_path' from source: unknown 28983 1726883007.68950: variable 'ansible_search_path' from source: unknown 28983 1726883007.69213: variable 'omit' from source: magic vars 28983 1726883007.69266: variable 'omit' from source: magic vars 28983 1726883007.69289: variable 'omit' from source: magic vars 28983 1726883007.69294: we have included files to process 28983 1726883007.69295: generating all_blocks data 28983 1726883007.69297: done generating all_blocks data 28983 1726883007.69299: processing included file: fedora.linux_system_roles.network 28983 1726883007.69323: in VariableManager get_vars() 28983 1726883007.69340: done with get_vars() 28983 1726883007.69372: in VariableManager get_vars() 28983 1726883007.69393: done with get_vars() 28983 1726883007.69441: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883007.69614: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883007.69735: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883007.70387: in VariableManager get_vars() 28983 1726883007.70412: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883007.73095: iterating over new_blocks loaded from include file 28983 1726883007.73097: in VariableManager get_vars() 28983 1726883007.73118: done with get_vars() 28983 1726883007.73120: filtering new block on tags 28983 1726883007.73571: done filtering new block on tags 28983 1726883007.73576: in VariableManager get_vars() 28983 1726883007.73596: done with get_vars() 28983 1726883007.73598: filtering new block on tags 28983 1726883007.73627: done 
filtering new block on tags 28983 1726883007.73630: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883007.73638: extending task lists for all hosts with included blocks 28983 1726883007.73872: done extending task lists 28983 1726883007.73874: done processing included files 28983 1726883007.73875: results queue empty 28983 1726883007.73876: checking for any_errors_fatal 28983 1726883007.73879: done checking for any_errors_fatal 28983 1726883007.73880: checking for max_fail_percentage 28983 1726883007.73882: done checking for max_fail_percentage 28983 1726883007.73883: checking to see if all hosts have failed and the running result is not ok 28983 1726883007.73884: done checking to see if all hosts have failed 28983 1726883007.73885: getting the remaining hosts for this loop 28983 1726883007.73886: done getting the remaining hosts for this loop 28983 1726883007.73889: getting the next task for host managed_node2 28983 1726883007.73895: done getting next task for host managed_node2 28983 1726883007.73898: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883007.73902: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883007.73914: getting variables 28983 1726883007.73915: in VariableManager get_vars() 28983 1726883007.73936: Calling all_inventory to load vars for managed_node2 28983 1726883007.73940: Calling groups_inventory to load vars for managed_node2 28983 1726883007.73944: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.73950: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.73954: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.73958: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.75957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.79127: done with get_vars() 28983 1726883007.79169: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:43:27 -0400 (0:00:00.176) 0:00:37.790 ****** 28983 1726883007.79265: entering _queue_task() for managed_node2/include_tasks 28983 1726883007.79674: worker is 1 (out of 1 available) 28983 1726883007.79691: exiting _queue_task() for managed_node2/include_tasks 28983 1726883007.79704: done queuing things up, now waiting for results queue to drain 28983 1726883007.79706: waiting for pending results... 
28983 1726883007.79951: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883007.80241: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b33 28983 1726883007.80246: variable 'ansible_search_path' from source: unknown 28983 1726883007.80248: variable 'ansible_search_path' from source: unknown 28983 1726883007.80251: calling self._execute() 28983 1726883007.80324: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.80342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.80361: variable 'omit' from source: magic vars 28983 1726883007.80820: variable 'ansible_distribution_major_version' from source: facts 28983 1726883007.80840: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883007.80853: _execute() done 28983 1726883007.80863: dumping result to json 28983 1726883007.80871: done dumping result, returning 28983 1726883007.80886: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-000000000b33] 28983 1726883007.80897: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b33 28983 1726883007.81066: no more pending results, returning what we have 28983 1726883007.81072: in VariableManager get_vars() 28983 1726883007.81117: Calling all_inventory to load vars for managed_node2 28983 1726883007.81120: Calling groups_inventory to load vars for managed_node2 28983 1726883007.81123: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.81137: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.81141: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.81146: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.81666: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b33 28983 
1726883007.81670: WORKER PROCESS EXITING 28983 1726883007.83447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.86540: done with get_vars() 28983 1726883007.86575: variable 'ansible_search_path' from source: unknown 28983 1726883007.86577: variable 'ansible_search_path' from source: unknown 28983 1726883007.86624: we have included files to process 28983 1726883007.86625: generating all_blocks data 28983 1726883007.86628: done generating all_blocks data 28983 1726883007.86632: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883007.86633: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883007.86640: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883007.87394: done processing included file 28983 1726883007.87397: iterating over new_blocks loaded from include file 28983 1726883007.87399: in VariableManager get_vars() 28983 1726883007.87429: done with get_vars() 28983 1726883007.87431: filtering new block on tags 28983 1726883007.87479: done filtering new block on tags 28983 1726883007.87482: in VariableManager get_vars() 28983 1726883007.87510: done with get_vars() 28983 1726883007.87513: filtering new block on tags 28983 1726883007.87580: done filtering new block on tags 28983 1726883007.87584: in VariableManager get_vars() 28983 1726883007.87614: done with get_vars() 28983 1726883007.87616: filtering new block on tags 28983 1726883007.87683: done filtering new block on tags 28983 1726883007.87686: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883007.87692: extending task lists for all hosts 
with included blocks 28983 1726883007.90491: done extending task lists 28983 1726883007.90493: done processing included files 28983 1726883007.90494: results queue empty 28983 1726883007.90495: checking for any_errors_fatal 28983 1726883007.90498: done checking for any_errors_fatal 28983 1726883007.90499: checking for max_fail_percentage 28983 1726883007.90500: done checking for max_fail_percentage 28983 1726883007.90501: checking to see if all hosts have failed and the running result is not ok 28983 1726883007.90510: done checking to see if all hosts have failed 28983 1726883007.90511: getting the remaining hosts for this loop 28983 1726883007.90513: done getting the remaining hosts for this loop 28983 1726883007.90516: getting the next task for host managed_node2 28983 1726883007.90523: done getting next task for host managed_node2 28983 1726883007.90526: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883007.90531: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883007.90545: getting variables 28983 1726883007.90547: in VariableManager get_vars() 28983 1726883007.90563: Calling all_inventory to load vars for managed_node2 28983 1726883007.90566: Calling groups_inventory to load vars for managed_node2 28983 1726883007.90568: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883007.90577: Calling all_plugins_play to load vars for managed_node2 28983 1726883007.90581: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883007.90585: Calling groups_plugins_play to load vars for managed_node2 28983 1726883007.92751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883007.95909: done with get_vars() 28983 1726883007.95945: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:43:27 -0400 (0:00:00.167) 0:00:37.958 ****** 28983 1726883007.96050: entering _queue_task() for managed_node2/setup 28983 1726883007.96449: worker is 1 (out of 1 available) 28983 1726883007.96463: exiting _queue_task() for managed_node2/setup 28983 1726883007.96479: done queuing things up, now waiting for results queue to drain 28983 1726883007.96482: waiting for pending results... 
28983 1726883007.96804: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883007.97013: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b90 28983 1726883007.97040: variable 'ansible_search_path' from source: unknown 28983 1726883007.97051: variable 'ansible_search_path' from source: unknown 28983 1726883007.97103: calling self._execute() 28983 1726883007.97226: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883007.97290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883007.97294: variable 'omit' from source: magic vars 28983 1726883007.97736: variable 'ansible_distribution_major_version' from source: facts 28983 1726883007.97756: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883007.98054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883008.00749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883008.00839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883008.00890: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883008.01030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883008.01036: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883008.01083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883008.01124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883008.01167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883008.01227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883008.01255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883008.01329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883008.01377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883008.01411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883008.01468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883008.01491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883008.01697: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883008.01712: variable 'ansible_facts' from source: unknown 28983 1726883008.02887: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883008.02940: when evaluation is False, skipping this task 28983 1726883008.02943: _execute() done 28983 1726883008.02946: dumping result to json 28983 1726883008.02949: done dumping result, returning 28983 1726883008.02951: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000000b90] 28983 1726883008.02953: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b90 28983 1726883008.03242: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b90 28983 1726883008.03246: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883008.03298: no more pending results, returning what we have 28983 1726883008.03302: results queue empty 28983 1726883008.03303: checking for any_errors_fatal 28983 1726883008.03305: done checking for any_errors_fatal 28983 1726883008.03306: checking for max_fail_percentage 28983 1726883008.03308: done checking for max_fail_percentage 28983 1726883008.03309: checking to see if all hosts have failed and the running result is not ok 28983 1726883008.03310: done checking to see if all hosts have failed 28983 1726883008.03311: getting the remaining hosts for this loop 28983 1726883008.03313: done getting the remaining hosts for this loop 28983 1726883008.03318: getting the next task for host managed_node2 28983 1726883008.03330: done getting next task for host managed_node2 28983 1726883008.03337: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883008.03344: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883008.03365: getting variables 28983 1726883008.03367: in VariableManager get_vars() 28983 1726883008.03411: Calling all_inventory to load vars for managed_node2 28983 1726883008.03414: Calling groups_inventory to load vars for managed_node2 28983 1726883008.03416: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883008.03425: Calling all_plugins_play to load vars for managed_node2 28983 1726883008.03429: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883008.03547: Calling groups_plugins_play to load vars for managed_node2 28983 1726883008.06711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883008.11416: done with get_vars() 28983 1726883008.11455: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:43:28 -0400 (0:00:00.155) 0:00:38.113 ****** 28983 1726883008.11581: entering _queue_task() for managed_node2/stat 28983 1726883008.11962: worker is 1 (out of 1 available) 28983 1726883008.11978: exiting _queue_task() for managed_node2/stat 28983 1726883008.11994: done queuing things up, now waiting for results queue to drain 28983 1726883008.11996: waiting for pending results... 
28983 1726883008.12337: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883008.12548: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b92 28983 1726883008.12577: variable 'ansible_search_path' from source: unknown 28983 1726883008.12587: variable 'ansible_search_path' from source: unknown 28983 1726883008.12637: calling self._execute() 28983 1726883008.12757: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883008.12775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883008.12797: variable 'omit' from source: magic vars 28983 1726883008.13268: variable 'ansible_distribution_major_version' from source: facts 28983 1726883008.13291: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883008.13640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883008.14129: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883008.14297: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883008.14538: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883008.14542: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883008.14787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883008.14824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883008.14939: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883008.14943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883008.15018: variable '__network_is_ostree' from source: set_fact 28983 1726883008.15051: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883008.15075: when evaluation is False, skipping this task 28983 1726883008.15084: _execute() done 28983 1726883008.15093: dumping result to json 28983 1726883008.15103: done dumping result, returning 28983 1726883008.15116: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000000b92] 28983 1726883008.15126: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b92 28983 1726883008.15278: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b92 28983 1726883008.15289: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883008.15355: no more pending results, returning what we have 28983 1726883008.15359: results queue empty 28983 1726883008.15360: checking for any_errors_fatal 28983 1726883008.15370: done checking for any_errors_fatal 28983 1726883008.15371: checking for max_fail_percentage 28983 1726883008.15373: done checking for max_fail_percentage 28983 1726883008.15374: checking to see if all hosts have failed and the running result is not ok 28983 1726883008.15375: done checking to see if all hosts have failed 28983 1726883008.15376: getting the remaining hosts for this loop 28983 1726883008.15378: done getting the remaining hosts for this loop 28983 
1726883008.15388: getting the next task for host managed_node2 28983 1726883008.15399: done getting next task for host managed_node2 28983 1726883008.15403: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883008.15410: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883008.15436: getting variables 28983 1726883008.15438: in VariableManager get_vars() 28983 1726883008.15479: Calling all_inventory to load vars for managed_node2 28983 1726883008.15482: Calling groups_inventory to load vars for managed_node2 28983 1726883008.15485: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883008.15495: Calling all_plugins_play to load vars for managed_node2 28983 1726883008.15499: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883008.15502: Calling groups_plugins_play to load vars for managed_node2 28983 1726883008.17131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883008.20072: done with get_vars() 28983 1726883008.20098: done getting variables 28983 1726883008.20147: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:43:28 -0400 (0:00:00.085) 0:00:38.199 ****** 28983 1726883008.20178: entering _queue_task() for managed_node2/set_fact 28983 1726883008.20430: worker is 1 (out of 1 available) 28983 1726883008.20448: exiting _queue_task() for managed_node2/set_fact 28983 1726883008.20462: done queuing things up, now waiting for results queue to drain 28983 1726883008.20465: waiting for pending results... 
28983 1726883008.20669: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883008.20856: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b93 28983 1726883008.20877: variable 'ansible_search_path' from source: unknown 28983 1726883008.20882: variable 'ansible_search_path' from source: unknown 28983 1726883008.20909: calling self._execute() 28983 1726883008.21043: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883008.21047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883008.21051: variable 'omit' from source: magic vars 28983 1726883008.21432: variable 'ansible_distribution_major_version' from source: facts 28983 1726883008.21612: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883008.21856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883008.22580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883008.22648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883008.22840: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883008.22886: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883008.23070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883008.23172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883008.23272: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883008.23376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883008.23618: variable '__network_is_ostree' from source: set_fact 28983 1726883008.23636: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883008.23645: when evaluation is False, skipping this task 28983 1726883008.23653: _execute() done 28983 1726883008.23661: dumping result to json 28983 1726883008.23888: done dumping result, returning 28983 1726883008.23892: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000000b93] 28983 1726883008.23895: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b93 28983 1726883008.23969: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b93 28983 1726883008.23973: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883008.24053: no more pending results, returning what we have 28983 1726883008.24058: results queue empty 28983 1726883008.24059: checking for any_errors_fatal 28983 1726883008.24068: done checking for any_errors_fatal 28983 1726883008.24069: checking for max_fail_percentage 28983 1726883008.24071: done checking for max_fail_percentage 28983 1726883008.24072: checking to see if all hosts have failed and the running result is not ok 28983 1726883008.24073: done checking to see if all hosts have failed 28983 1726883008.24074: getting the remaining hosts for this loop 28983 1726883008.24077: done getting the remaining hosts for this loop 
28983 1726883008.24083: getting the next task for host managed_node2 28983 1726883008.24096: done getting next task for host managed_node2 28983 1726883008.24101: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883008.24109: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883008.24138: getting variables 28983 1726883008.24140: in VariableManager get_vars() 28983 1726883008.24186: Calling all_inventory to load vars for managed_node2 28983 1726883008.24190: Calling groups_inventory to load vars for managed_node2 28983 1726883008.24193: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883008.24205: Calling all_plugins_play to load vars for managed_node2 28983 1726883008.24210: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883008.24214: Calling groups_plugins_play to load vars for managed_node2 28983 1726883008.27250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883008.30859: done with get_vars() 28983 1726883008.30898: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:43:28 -0400 (0:00:00.108) 0:00:38.307 ****** 28983 1726883008.31019: entering _queue_task() for managed_node2/service_facts 28983 1726883008.31779: worker is 1 (out of 1 available) 28983 1726883008.31791: exiting _queue_task() for managed_node2/service_facts 28983 1726883008.31803: done queuing things up, now waiting for results queue to drain 28983 1726883008.31805: waiting for pending results... 
28983 1726883008.32254: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883008.32603: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b95 28983 1726883008.32627: variable 'ansible_search_path' from source: unknown 28983 1726883008.32639: variable 'ansible_search_path' from source: unknown 28983 1726883008.32688: calling self._execute() 28983 1726883008.33244: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883008.33327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883008.33332: variable 'omit' from source: magic vars 28983 1726883008.34291: variable 'ansible_distribution_major_version' from source: facts 28983 1726883008.34295: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883008.34298: variable 'omit' from source: magic vars 28983 1726883008.34538: variable 'omit' from source: magic vars 28983 1726883008.34657: variable 'omit' from source: magic vars 28983 1726883008.34780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883008.34832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883008.34998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883008.35002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883008.35106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883008.35114: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883008.35149: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883008.35159: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883008.35445: Set connection var ansible_connection to ssh 28983 1726883008.35516: Set connection var ansible_shell_executable to /bin/sh 28983 1726883008.35540: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883008.35559: Set connection var ansible_timeout to 10 28983 1726883008.35652: Set connection var ansible_pipelining to False 28983 1726883008.35655: Set connection var ansible_shell_type to sh 28983 1726883008.35667: variable 'ansible_shell_executable' from source: unknown 28983 1726883008.35760: variable 'ansible_connection' from source: unknown 28983 1726883008.35765: variable 'ansible_module_compression' from source: unknown 28983 1726883008.35768: variable 'ansible_shell_type' from source: unknown 28983 1726883008.35770: variable 'ansible_shell_executable' from source: unknown 28983 1726883008.35775: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883008.35778: variable 'ansible_pipelining' from source: unknown 28983 1726883008.35780: variable 'ansible_timeout' from source: unknown 28983 1726883008.35870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883008.36351: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883008.36429: variable 'omit' from source: magic vars 28983 1726883008.36433: starting attempt loop 28983 1726883008.36437: running the handler 28983 1726883008.36439: _low_level_execute_command(): starting 28983 1726883008.36441: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883008.37252: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883008.37319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883008.37350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883008.37370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883008.37479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883008.39278: stdout chunk (state=3): >>>/root <<< 28983 1726883008.39460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883008.39493: stderr chunk (state=3): >>><<< 28983 1726883008.39505: stdout chunk (state=3): >>><<< 28983 1726883008.39539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883008.39580: _low_level_execute_command(): starting 28983 1726883008.39592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274 `" && echo ansible-tmp-1726883008.3954706-30433-149629301828274="` echo /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274 `" ) && sleep 0' 28983 1726883008.40264: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883008.40283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883008.40312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883008.40331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883008.40423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883008.40477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883008.40495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883008.40525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883008.40643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883008.42708: stdout chunk (state=3): >>>ansible-tmp-1726883008.3954706-30433-149629301828274=/root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274 <<< 28983 1726883008.42924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883008.42928: stdout chunk (state=3): >>><<< 28983 1726883008.42931: stderr chunk (state=3): >>><<< 28983 1726883008.43140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883008.3954706-30433-149629301828274=/root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883008.43145: variable 'ansible_module_compression' from source: unknown 28983 1726883008.43147: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883008.43149: variable 'ansible_facts' from source: unknown 28983 1726883008.43208: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py 28983 1726883008.43400: Sending initial data 28983 1726883008.43409: Sent initial data (162 bytes) 28983 1726883008.44066: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883008.44086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883008.44161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883008.44224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883008.44270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883008.44300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883008.44378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883008.46054: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883008.46140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883008.46249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpnqngn18a /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py <<< 28983 1726883008.46252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py" <<< 28983 1726883008.46360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpnqngn18a" to remote "/root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py" <<< 28983 1726883008.47706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883008.47739: stderr chunk (state=3): >>><<< 28983 1726883008.47867: stdout chunk (state=3): >>><<< 28983 1726883008.47874: done transferring module to remote 28983 1726883008.47877: _low_level_execute_command(): starting 28983 1726883008.47880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/ /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py && sleep 0' 28983 1726883008.48448: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883008.48474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883008.48496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883008.48550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883008.48636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883008.48658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883008.48760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883008.50759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883008.50767: stdout chunk (state=3): >>><<< 28983 1726883008.50770: stderr chunk (state=3): >>><<< 28983 1726883008.50880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883008.50886: _low_level_execute_command(): starting 28983 1726883008.50890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/AnsiballZ_service_facts.py && sleep 0' 28983 1726883008.51557: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883008.51595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883008.51612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883008.51633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883008.51748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 28983 1726883010.52718: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": 
"nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "<<< 28983 1726883010.52810: stdout chunk (state=3): >>>source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": 
"sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": 
"inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "<<< 28983 1726883010.52838: stdout chunk (state=3): >>>systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883010.54370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883010.54495: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883010.54499: stdout chunk (state=3): >>><<< 28983 1726883010.54501: stderr chunk (state=3): >>><<< 28983 1726883010.54640: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": 
"bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": 
"plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883010.55803: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883010.55821: _low_level_execute_command(): starting 28983 1726883010.55832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883008.3954706-30433-149629301828274/ > /dev/null 2>&1 && sleep 0' 28983 1726883010.56617: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883010.56644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883010.56754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883010.58821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883010.58833: stdout chunk (state=3): >>><<< 28983 1726883010.58849: stderr chunk (state=3): >>><<< 28983 1726883010.58869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883010.58948: handler run complete 28983 1726883010.59184: variable 'ansible_facts' from source: unknown 28983 1726883010.59427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883010.60240: variable 'ansible_facts' from source: unknown 28983 1726883010.60463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883010.60853: attempt loop complete, returning result 28983 1726883010.60865: _execute() done 28983 1726883010.60873: dumping result to json 28983 1726883010.60969: done dumping result, returning 28983 1726883010.60983: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000000b95] 28983 1726883010.60995: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b95 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883010.62574: no more pending results, returning what we have 28983 1726883010.62577: results queue empty 28983 1726883010.62578: checking for any_errors_fatal 28983 1726883010.62584: done checking for any_errors_fatal 28983 1726883010.62585: checking for max_fail_percentage 28983 1726883010.62588: done checking for max_fail_percentage 28983 1726883010.62589: checking to see if all hosts have failed and the running result is not ok 28983 1726883010.62590: done checking to see if all hosts have failed 28983 1726883010.62591: getting the remaining hosts for this loop 28983 1726883010.62593: done getting the remaining hosts for this loop 28983 1726883010.62597: getting the next task for host managed_node2 28983 1726883010.62606: done getting next task for host managed_node2 28983 1726883010.62610: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883010.62616: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883010.62629: getting variables 28983 1726883010.62631: in VariableManager get_vars() 28983 1726883010.62666: Calling all_inventory to load vars for managed_node2 28983 1726883010.62670: Calling groups_inventory to load vars for managed_node2 28983 1726883010.62672: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883010.62682: Calling all_plugins_play to load vars for managed_node2 28983 1726883010.62685: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883010.62689: Calling groups_plugins_play to load vars for managed_node2 28983 1726883010.63323: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b95 28983 1726883010.63327: WORKER PROCESS EXITING 28983 1726883010.64948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883010.67826: done with get_vars() 28983 1726883010.67864: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:43:30 -0400 (0:00:02.369) 0:00:40.677 ****** 28983 1726883010.67985: entering _queue_task() for managed_node2/package_facts 28983 1726883010.68312: worker is 1 (out of 1 available) 28983 1726883010.68327: exiting _queue_task() for managed_node2/package_facts 28983 1726883010.68444: done queuing things up, now waiting for results queue to drain 28983 1726883010.68447: waiting for pending results... 
28983 1726883010.68658: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883010.68868: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b96 28983 1726883010.68898: variable 'ansible_search_path' from source: unknown 28983 1726883010.68908: variable 'ansible_search_path' from source: unknown 28983 1726883010.68952: calling self._execute() 28983 1726883010.69067: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883010.69081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883010.69099: variable 'omit' from source: magic vars 28983 1726883010.69539: variable 'ansible_distribution_major_version' from source: facts 28983 1726883010.69561: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883010.69572: variable 'omit' from source: magic vars 28983 1726883010.69739: variable 'omit' from source: magic vars 28983 1726883010.69743: variable 'omit' from source: magic vars 28983 1726883010.69779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883010.69825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883010.69853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883010.69877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883010.69899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883010.69940: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883010.69951: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883010.69959: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883010.70085: Set connection var ansible_connection to ssh 28983 1726883010.70138: Set connection var ansible_shell_executable to /bin/sh 28983 1726883010.70141: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883010.70144: Set connection var ansible_timeout to 10 28983 1726883010.70149: Set connection var ansible_pipelining to False 28983 1726883010.70158: Set connection var ansible_shell_type to sh 28983 1726883010.70191: variable 'ansible_shell_executable' from source: unknown 28983 1726883010.70203: variable 'ansible_connection' from source: unknown 28983 1726883010.70323: variable 'ansible_module_compression' from source: unknown 28983 1726883010.70327: variable 'ansible_shell_type' from source: unknown 28983 1726883010.70329: variable 'ansible_shell_executable' from source: unknown 28983 1726883010.70332: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883010.70336: variable 'ansible_pipelining' from source: unknown 28983 1726883010.70339: variable 'ansible_timeout' from source: unknown 28983 1726883010.70341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883010.70499: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883010.70518: variable 'omit' from source: magic vars 28983 1726883010.70529: starting attempt loop 28983 1726883010.70543: running the handler 28983 1726883010.70562: _low_level_execute_command(): starting 28983 1726883010.70575: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883010.71437: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883010.71457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883010.71475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883010.71499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883010.71612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883010.73408: stdout chunk (state=3): >>>/root <<< 28983 1726883010.73589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883010.73605: stdout chunk (state=3): >>><<< 28983 1726883010.73619: stderr chunk (state=3): >>><<< 28983 1726883010.73647: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883010.73756: _low_level_execute_command(): starting 28983 1726883010.73761: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864 `" && echo ansible-tmp-1726883010.7365482-30480-193233249198864="` echo /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864 `" ) && sleep 0' 28983 1726883010.74327: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883010.74346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883010.74361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883010.74484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883010.74508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883010.74541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883010.74647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883010.76684: stdout chunk (state=3): >>>ansible-tmp-1726883010.7365482-30480-193233249198864=/root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864 <<< 28983 1726883010.76880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883010.76884: stdout chunk (state=3): >>><<< 28983 1726883010.76887: stderr chunk (state=3): >>><<< 28983 1726883010.76909: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883010.7365482-30480-193233249198864=/root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883010.77042: variable 'ansible_module_compression' from source: unknown 28983 1726883010.77045: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883010.77083: variable 'ansible_facts' from source: unknown 28983 1726883010.77272: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py 28983 1726883010.77458: Sending initial data 28983 1726883010.77469: Sent initial data (162 bytes) 28983 1726883010.78155: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883010.78171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883010.78254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726883010.78298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883010.78318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883010.78344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883010.78456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883010.80176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883010.80252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883010.80315: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmph6d0rcwf /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py <<< 28983 1726883010.80321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py" <<< 28983 1726883010.80393: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmph6d0rcwf" to remote "/root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py" <<< 28983 1726883010.82432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883010.82468: stderr chunk (state=3): >>><<< 28983 1726883010.82597: stdout chunk (state=3): >>><<< 28983 1726883010.82601: done transferring module to remote 28983 1726883010.82604: _low_level_execute_command(): starting 28983 1726883010.82606: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/ /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py && sleep 0' 28983 1726883010.83064: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883010.83083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883010.83096: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883010.83170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883010.83174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883010.83240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883010.85245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883010.85248: stdout chunk (state=3): >>><<< 28983 1726883010.85251: stderr chunk (state=3): >>><<< 28983 1726883010.85299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883010.85306: _low_level_execute_command(): starting 28983 1726883010.85309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/AnsiballZ_package_facts.py && sleep 0' 28983 1726883010.85887: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883010.85940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883010.85944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883010.85946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883010.85949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883010.86016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883010.86028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 28983 1726883010.86143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883011.49675: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", 
"source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 28983 1726883011.49759: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": 
"e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": 
"dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 28983 1726883011.49842: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": 
"0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": 
[{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", 
"version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": 
"python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": 
"3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64":
[{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": 
"5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces":<<< 28983 1726883011.49918: stdout chunk (state=3): >>> [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": 
[{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883011.51917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883011.51921: stdout chunk (state=3): >>><<< 28983 1726883011.51924: stderr chunk (state=3): >>><<< 28983 1726883011.51941: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", 
"version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", 
"version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": 
"libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": 
"nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": 
[{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": 
"python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": 
"2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", 
"version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": 
[{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", 
"version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": 
"6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.46.139 closed. 28983 1726883011.63813: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883011.63819: _low_level_execute_command(): starting 28983 1726883011.63822: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883010.7365482-30480-193233249198864/ > /dev/null 2>&1 && sleep 0' 28983 1726883011.64220: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883011.64229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883011.64248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883011.64266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883011.64280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883011.64289: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883011.64299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883011.64314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883011.64323: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883011.64332: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883011.64343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883011.64354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883011.64367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883011.64376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883011.64384: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883011.64475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883011.64591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883011.64595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883011.66609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883011.66913: stderr chunk (state=3): >>><<< 28983 1726883011.66916: stdout chunk (state=3): >>><<< 28983 1726883011.66936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883011.66943: handler run complete 28983 1726883011.69915: variable 'ansible_facts' from source: unknown 28983 1726883011.71012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883011.74982: variable 'ansible_facts' from source: unknown 28983 1726883011.76040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883011.78764: attempt loop complete, returning result 28983 1726883011.78790: _execute() done 28983 1726883011.78793: dumping result to json 28983 1726883011.79536: done dumping result, returning 28983 1726883011.79546: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000000b96] 28983 1726883011.79549: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b96 28983 1726883011.91882: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b96 28983 1726883011.91886: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883011.92005: no more pending results, returning what we have 28983 1726883011.92008: results queue empty 28983 1726883011.92009: checking for any_errors_fatal 28983 1726883011.92014: done 
checking for any_errors_fatal 28983 1726883011.92015: checking for max_fail_percentage 28983 1726883011.92016: done checking for max_fail_percentage 28983 1726883011.92017: checking to see if all hosts have failed and the running result is not ok 28983 1726883011.92018: done checking to see if all hosts have failed 28983 1726883011.92019: getting the remaining hosts for this loop 28983 1726883011.92021: done getting the remaining hosts for this loop 28983 1726883011.92024: getting the next task for host managed_node2 28983 1726883011.92032: done getting next task for host managed_node2 28983 1726883011.92036: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883011.92043: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883011.92055: getting variables 28983 1726883011.92056: in VariableManager get_vars() 28983 1726883011.92079: Calling all_inventory to load vars for managed_node2 28983 1726883011.92083: Calling groups_inventory to load vars for managed_node2 28983 1726883011.92086: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883011.92098: Calling all_plugins_play to load vars for managed_node2 28983 1726883011.92102: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883011.92105: Calling groups_plugins_play to load vars for managed_node2 28983 1726883011.95932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.01910: done with get_vars() 28983 1726883012.02152: done getting variables 28983 1726883012.02214: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:43:32 -0400 (0:00:01.344) 0:00:42.022 ****** 28983 1726883012.02457: entering _queue_task() for managed_node2/debug 28983 1726883012.03021: worker is 1 (out of 1 available) 28983 1726883012.03032: exiting _queue_task() for managed_node2/debug 28983 1726883012.03047: done queuing things up, now waiting for results queue to drain 28983 1726883012.03048: waiting for pending results... 
28983 1726883012.03656: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883012.04141: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b34 28983 1726883012.04147: variable 'ansible_search_path' from source: unknown 28983 1726883012.04150: variable 'ansible_search_path' from source: unknown 28983 1726883012.04153: calling self._execute() 28983 1726883012.04321: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.04336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.04353: variable 'omit' from source: magic vars 28983 1726883012.05267: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.05318: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.05331: variable 'omit' from source: magic vars 28983 1726883012.05490: variable 'omit' from source: magic vars 28983 1726883012.05750: variable 'network_provider' from source: set_fact 28983 1726883012.05777: variable 'omit' from source: magic vars 28983 1726883012.06140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883012.06144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883012.06147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883012.06150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883012.06152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883012.06155: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883012.06157: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883012.06159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.06447: Set connection var ansible_connection to ssh 28983 1726883012.06466: Set connection var ansible_shell_executable to /bin/sh 28983 1726883012.06738: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883012.06742: Set connection var ansible_timeout to 10 28983 1726883012.06744: Set connection var ansible_pipelining to False 28983 1726883012.06747: Set connection var ansible_shell_type to sh 28983 1726883012.06749: variable 'ansible_shell_executable' from source: unknown 28983 1726883012.06751: variable 'ansible_connection' from source: unknown 28983 1726883012.06754: variable 'ansible_module_compression' from source: unknown 28983 1726883012.06756: variable 'ansible_shell_type' from source: unknown 28983 1726883012.06758: variable 'ansible_shell_executable' from source: unknown 28983 1726883012.06761: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.06763: variable 'ansible_pipelining' from source: unknown 28983 1726883012.06765: variable 'ansible_timeout' from source: unknown 28983 1726883012.06767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.07129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883012.07339: variable 'omit' from source: magic vars 28983 1726883012.07343: starting attempt loop 28983 1726883012.07346: running the handler 28983 1726883012.07350: handler run complete 28983 1726883012.07352: attempt loop complete, returning result 28983 1726883012.07355: _execute() done 28983 1726883012.07357: dumping result to json 28983 1726883012.07359: done dumping result, returning 
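The "Set connection var ..." entries above show the per-task connection settings the executor resolved (ssh connection, `/bin/sh` shell, a 10-second timeout, pipelining off). For context, such values can be pinned per host in inventory; this is a hedged sketch, not the inventory actually used here — the log indicates most of these came from defaults ("from source: unknown"), not from inventory:

```yaml
# Hypothetical inventory fragment. Variable names mirror the
# "Set connection var ..." lines in the trace; the values for
# managed_node2 in this run were resolved from defaults.
all:
  hosts:
    managed_node2:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_timeout: 10
      ansible_pipelining: false
```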
28983 1726883012.07362: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000000b34] 28983 1726883012.07365: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b34 ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883012.07716: no more pending results, returning what we have 28983 1726883012.07720: results queue empty 28983 1726883012.07721: checking for any_errors_fatal 28983 1726883012.07740: done checking for any_errors_fatal 28983 1726883012.07741: checking for max_fail_percentage 28983 1726883012.07743: done checking for max_fail_percentage 28983 1726883012.07744: checking to see if all hosts have failed and the running result is not ok 28983 1726883012.07745: done checking to see if all hosts have failed 28983 1726883012.07746: getting the remaining hosts for this loop 28983 1726883012.07748: done getting the remaining hosts for this loop 28983 1726883012.07753: getting the next task for host managed_node2 28983 1726883012.07762: done getting next task for host managed_node2 28983 1726883012.07766: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883012.07775: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883012.07788: getting variables 28983 1726883012.07790: in VariableManager get_vars() 28983 1726883012.07943: Calling all_inventory to load vars for managed_node2 28983 1726883012.07947: Calling groups_inventory to load vars for managed_node2 28983 1726883012.07950: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883012.07956: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b34 28983 1726883012.07960: WORKER PROCESS EXITING 28983 1726883012.07969: Calling all_plugins_play to load vars for managed_node2 28983 1726883012.07975: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883012.07979: Calling groups_plugins_play to load vars for managed_node2 28983 1726883012.12910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.19556: done with get_vars() 28983 1726883012.19603: done getting variables 28983 1726883012.19687: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:43:32 -0400 (0:00:00.172) 0:00:42.195 ****** 28983 1726883012.19858: entering _queue_task() for managed_node2/fail 28983 1726883012.20652: worker is 1 (out of 1 available) 28983 1726883012.20664: exiting _queue_task() for managed_node2/fail 28983 1726883012.20679: done queuing things up, now waiting for results queue to drain 28983 1726883012.20681: waiting for pending results... 28983 1726883012.21442: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883012.21865: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b35 28983 1726883012.22060: variable 'ansible_search_path' from source: unknown 28983 1726883012.22070: variable 'ansible_search_path' from source: unknown 28983 1726883012.22118: calling self._execute() 28983 1726883012.22232: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.22640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.22644: variable 'omit' from source: magic vars 28983 1726883012.23309: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.23330: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.23691: variable 'network_state' from source: role '' defaults 28983 1726883012.23710: Evaluated conditional (network_state != {}): False 28983 1726883012.23720: when evaluation is False, skipping this task 28983 1726883012.23727: _execute() done 28983 1726883012.23737: dumping result to json 28983 1726883012.23747: done dumping result, returning 28983 1726883012.23759: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000000b35] 28983 1726883012.23770: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b35 28983 1726883012.23892: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b35 28983 1726883012.23900: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883012.23965: no more pending results, returning what we have 28983 1726883012.23969: results queue empty 28983 1726883012.23970: checking for any_errors_fatal 28983 1726883012.23980: done checking for any_errors_fatal 28983 1726883012.23981: checking for max_fail_percentage 28983 1726883012.23983: done checking for max_fail_percentage 28983 1726883012.23983: checking to see if all hosts have failed and the running result is not ok 28983 1726883012.23984: done checking to see if all hosts have failed 28983 1726883012.23985: getting the remaining hosts for this loop 28983 1726883012.23987: done getting the remaining hosts for this loop 28983 1726883012.23992: getting the next task for host managed_node2 28983 1726883012.24006: done getting next task for host managed_node2 28983 1726883012.24011: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883012.24018: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883012.24044: getting variables 28983 1726883012.24046: in VariableManager get_vars() 28983 1726883012.24089: Calling all_inventory to load vars for managed_node2 28983 1726883012.24092: Calling groups_inventory to load vars for managed_node2 28983 1726883012.24095: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883012.24105: Calling all_plugins_play to load vars for managed_node2 28983 1726883012.24108: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883012.24228: Calling groups_plugins_play to load vars for managed_node2 28983 1726883012.28937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.33923: done with get_vars() 28983 1726883012.33963: done getting variables 28983 1726883012.34047: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:43:32 -0400 (0:00:00.143) 0:00:42.338 ****** 28983 1726883012.34094: entering _queue_task() for managed_node2/fail 28983 1726883012.34578: worker is 1 (out of 1 available) 28983 1726883012.34593: exiting _queue_task() for managed_node2/fail 28983 1726883012.34606: done queuing things up, now waiting for results queue to drain 28983 1726883012.34608: waiting for pending results... 28983 1726883012.34888: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883012.35088: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b36 28983 1726883012.35112: variable 'ansible_search_path' from source: unknown 28983 1726883012.35127: variable 'ansible_search_path' from source: unknown 28983 1726883012.35175: calling self._execute() 28983 1726883012.35301: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.35314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.35330: variable 'omit' from source: magic vars 28983 1726883012.35810: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.35828: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.36008: variable 'network_state' from source: role '' defaults 28983 1726883012.36024: Evaluated conditional (network_state != {}): False 28983 1726883012.36033: when evaluation is False, skipping this task 28983 1726883012.36042: _execute() done 28983 1726883012.36050: dumping result to json 28983 1726883012.36057: done dumping result, returning 28983 1726883012.36068: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000000b36] 28983 1726883012.36083: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b36 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883012.36276: no more pending results, returning what we have 28983 1726883012.36281: results queue empty 28983 1726883012.36282: checking for any_errors_fatal 28983 1726883012.36292: done checking for any_errors_fatal 28983 1726883012.36293: checking for max_fail_percentage 28983 1726883012.36296: done checking for max_fail_percentage 28983 1726883012.36297: checking to see if all hosts have failed and the running result is not ok 28983 1726883012.36298: done checking to see if all hosts have failed 28983 1726883012.36298: getting the remaining hosts for this loop 28983 1726883012.36301: done getting the remaining hosts for this loop 28983 1726883012.36307: getting the next task for host managed_node2 28983 1726883012.36329: done getting next task for host managed_node2 28983 1726883012.36536: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883012.36544: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883012.36569: getting variables 28983 1726883012.36570: in VariableManager get_vars() 28983 1726883012.36614: Calling all_inventory to load vars for managed_node2 28983 1726883012.36618: Calling groups_inventory to load vars for managed_node2 28983 1726883012.36621: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883012.36633: Calling all_plugins_play to load vars for managed_node2 28983 1726883012.36644: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883012.36650: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b36 28983 1726883012.36654: WORKER PROCESS EXITING 28983 1726883012.36658: Calling groups_plugins_play to load vars for managed_node2 28983 1726883012.39095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.45724: done with get_vars() 28983 1726883012.45893: done getting variables 28983 1726883012.46099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:43:32 -0400 (0:00:00.120) 0:00:42.459 ****** 28983 1726883012.46147: entering _queue_task() for managed_node2/fail 28983 1726883012.47161: worker is 1 (out of 1 available) 28983 1726883012.47178: exiting _queue_task() for managed_node2/fail 28983 1726883012.47191: done queuing things up, now waiting for results queue to drain 28983 1726883012.47193: waiting for pending results... 28983 1726883012.47744: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883012.48140: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b37 28983 1726883012.48165: variable 'ansible_search_path' from source: unknown 28983 1726883012.48174: variable 'ansible_search_path' from source: unknown 28983 1726883012.48220: calling self._execute() 28983 1726883012.48641: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.48644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.48648: variable 'omit' from source: magic vars 28983 1726883012.49411: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.49527: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.50440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883012.56442: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883012.56448: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883012.56451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 
1726883012.57014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883012.57018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883012.57022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.58028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.58176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.58394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.58418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.58543: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.58764: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883012.58922: variable 'ansible_distribution' from source: facts 28983 1726883012.59088: variable '__network_rh_distros' from source: role '' defaults 28983 1726883012.59106: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883012.59339: when evaluation is False, skipping this task 28983 1726883012.59343: _execute() done 28983 1726883012.59346: dumping result to json 28983 
1726883012.59349: done dumping result, returning 28983 1726883012.59352: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000000b37] 28983 1726883012.59354: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b37 28983 1726883012.59439: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b37 28983 1726883012.59443: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883012.59506: no more pending results, returning what we have 28983 1726883012.59523: results queue empty 28983 1726883012.59524: checking for any_errors_fatal 28983 1726883012.59531: done checking for any_errors_fatal 28983 1726883012.59532: checking for max_fail_percentage 28983 1726883012.59536: done checking for max_fail_percentage 28983 1726883012.59537: checking to see if all hosts have failed and the running result is not ok 28983 1726883012.59538: done checking to see if all hosts have failed 28983 1726883012.59539: getting the remaining hosts for this loop 28983 1726883012.59541: done getting the remaining hosts for this loop 28983 1726883012.59546: getting the next task for host managed_node2 28983 1726883012.59557: done getting next task for host managed_node2 28983 1726883012.59565: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883012.59574: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883012.59599: getting variables 28983 1726883012.59601: in VariableManager get_vars() 28983 1726883012.59770: Calling all_inventory to load vars for managed_node2 28983 1726883012.59773: Calling groups_inventory to load vars for managed_node2 28983 1726883012.59777: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883012.59787: Calling all_plugins_play to load vars for managed_node2 28983 1726883012.59791: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883012.59795: Calling groups_plugins_play to load vars for managed_node2 28983 1726883012.62778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.68391: done with get_vars() 28983 1726883012.68443: done getting variables 28983 1726883012.68519: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:43:32 -0400 (0:00:00.225) 0:00:42.684 ****** 28983 1726883012.68691: entering _queue_task() for managed_node2/dnf 28983 1726883012.69122: worker is 1 (out of 1 available) 28983 1726883012.69143: exiting _queue_task() for managed_node2/dnf 28983 1726883012.69159: done queuing things up, now waiting for results queue to drain 28983 1726883012.69161: waiting for pending results... 28983 1726883012.69620: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883012.69926: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b38 28983 1726883012.69949: variable 'ansible_search_path' from source: unknown 28983 1726883012.69958: variable 'ansible_search_path' from source: unknown 28983 1726883012.70006: calling self._execute() 28983 1726883012.70123: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.70138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.70155: variable 'omit' from source: magic vars 28983 1726883012.70596: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.70615: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.70903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883012.73927: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883012.74015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883012.74069: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883012.74119: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883012.74156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883012.74258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.74317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.74357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.74418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.74443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.74596: variable 'ansible_distribution' from source: facts 28983 1726883012.74612: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.74625: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883012.74785: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883012.74979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.75041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.75052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.75108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.75130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.75191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.75241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.75268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.75366: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.75369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.75401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.75440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.75474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.75523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.75545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.75941: variable 'network_connections' from source: include params 28983 1726883012.75944: variable 'interface' from source: play vars 28983 1726883012.75955: variable 'interface' from source: play vars 28983 1726883012.76061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883012.76359: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883012.76428: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883012.76479: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883012.76515: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883012.76583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883012.76611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883012.76704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.76707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883012.76760: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883012.77166: variable 'network_connections' from source: include params 28983 1726883012.77172: variable 'interface' from source: play vars 28983 1726883012.77270: variable 'interface' from source: play vars 28983 1726883012.77336: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883012.77340: when evaluation is False, skipping this task 28983 1726883012.77343: _execute() done 28983 1726883012.77348: dumping result to json 28983 1726883012.77468: done dumping result, returning 28983 1726883012.77471: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000b38] 28983 1726883012.77474: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b38 28983 1726883012.77548: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b38 28983 1726883012.77551: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883012.77629: no more pending results, returning what we have 28983 1726883012.77633: results queue empty 28983 1726883012.77635: checking for any_errors_fatal 28983 1726883012.77643: done checking for any_errors_fatal 28983 1726883012.77644: checking for max_fail_percentage 28983 1726883012.77646: done checking for max_fail_percentage 28983 1726883012.77647: checking to see if all hosts have failed and the running result is not ok 28983 1726883012.77648: done checking to see if all hosts have failed 28983 1726883012.77649: getting the remaining hosts for this loop 28983 1726883012.77651: done getting the remaining hosts for this loop 28983 1726883012.77656: getting the next task for host managed_node2 28983 1726883012.77666: done getting next task for host managed_node2 28983 1726883012.77671: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883012.77677: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883012.77698: getting variables 28983 1726883012.77700: in VariableManager get_vars() 28983 1726883012.77841: Calling all_inventory to load vars for managed_node2 28983 1726883012.77845: Calling groups_inventory to load vars for managed_node2 28983 1726883012.77848: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883012.77858: Calling all_plugins_play to load vars for managed_node2 28983 1726883012.77862: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883012.77866: Calling groups_plugins_play to load vars for managed_node2 28983 1726883012.80001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.81764: done with get_vars() 28983 1726883012.81839: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883012.81929: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:43:32 -0400 (0:00:00.132) 0:00:42.817 ****** 28983 1726883012.81972: entering _queue_task() for managed_node2/yum 28983 1726883012.82362: worker is 1 (out of 1 available) 28983 1726883012.82375: exiting _queue_task() for managed_node2/yum 28983 1726883012.82397: done queuing things up, now waiting for results queue to drain 28983 1726883012.82399: waiting for pending results... 28983 1726883012.82803: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883012.82935: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b39 28983 1726883012.82951: variable 'ansible_search_path' from source: unknown 28983 1726883012.82955: variable 'ansible_search_path' from source: unknown 28983 1726883012.82991: calling self._execute() 28983 1726883012.83073: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.83082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.83092: variable 'omit' from source: magic vars 28983 1726883012.83414: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.83425: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.83581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883012.85718: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883012.85723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883012.85749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883012.85793: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883012.85824: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883012.85921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.85967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.86001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.86053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.86071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.86186: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.86203: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883012.86207: when evaluation is False, skipping this task 28983 1726883012.86209: _execute() done 28983 1726883012.86215: dumping result to json 28983 1726883012.86220: done dumping result, returning 28983 1726883012.86229: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000b39] 28983 1726883012.86263: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b39 28983 1726883012.86341: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b39 28983 1726883012.86344: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883012.86425: no more pending results, returning what we have 28983 1726883012.86428: results queue empty 28983 1726883012.86429: checking for any_errors_fatal 28983 1726883012.86440: done checking for any_errors_fatal 28983 1726883012.86441: checking for max_fail_percentage 28983 1726883012.86443: done checking for max_fail_percentage 28983 1726883012.86444: checking to see if all hosts have failed and the running result is not ok 28983 1726883012.86445: done checking to see if all hosts have failed 28983 1726883012.86446: getting the remaining hosts for this loop 28983 1726883012.86448: done getting the remaining hosts for this loop 28983 1726883012.86453: getting the next task for host managed_node2 28983 1726883012.86462: done getting next task for host managed_node2 28983 1726883012.86467: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883012.86474: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883012.86495: getting variables 28983 1726883012.86497: in VariableManager get_vars() 28983 1726883012.86531: Calling all_inventory to load vars for managed_node2 28983 1726883012.86617: Calling groups_inventory to load vars for managed_node2 28983 1726883012.86621: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883012.86631: Calling all_plugins_play to load vars for managed_node2 28983 1726883012.86637: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883012.86641: Calling groups_plugins_play to load vars for managed_node2 28983 1726883012.88843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883012.91882: done with get_vars() 28983 1726883012.91925: done getting variables 28983 1726883012.92008: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:43:32 -0400 (0:00:00.100) 0:00:42.918 ****** 28983 1726883012.92058: entering _queue_task() for managed_node2/fail 28983 1726883012.92666: worker is 1 (out of 1 available) 28983 1726883012.92679: exiting _queue_task() for managed_node2/fail 28983 1726883012.92690: done queuing things up, now waiting for results queue to drain 28983 1726883012.92692: waiting for pending results... 
28983 1726883012.92931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883012.92984: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b3a 28983 1726883012.93009: variable 'ansible_search_path' from source: unknown 28983 1726883012.93017: variable 'ansible_search_path' from source: unknown 28983 1726883012.93067: calling self._execute() 28983 1726883012.93189: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883012.93202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883012.93220: variable 'omit' from source: magic vars 28983 1726883012.93662: variable 'ansible_distribution_major_version' from source: facts 28983 1726883012.93690: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883012.93848: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883012.94228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883012.96946: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883012.97033: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883012.97089: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883012.97174: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883012.97180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883012.97288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883012.97760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.97801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.97935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.97939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.97952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.97989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.98025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.98090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.98113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.98178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883012.98212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883012.98250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.98310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883012.98332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883012.98568: variable 'network_connections' from source: include params 28983 1726883012.98739: variable 'interface' from source: play vars 28983 1726883012.98743: variable 'interface' from source: play vars 28983 1726883012.98786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883012.99006: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883012.99061: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883012.99110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883012.99153: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883012.99215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883012.99251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883012.99298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883012.99338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883012.99416: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883012.99778: variable 'network_connections' from source: include params 28983 1726883012.99790: variable 'interface' from source: play vars 28983 1726883012.99875: variable 'interface' from source: play vars 28983 1726883012.99917: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883012.99927: when evaluation is False, skipping this task 28983 1726883012.99951: _execute() done 28983 1726883012.99955: dumping result to json 28983 1726883012.99957: done dumping result, returning 28983 1726883013.00061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000b3a] 28983 1726883013.00064: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3a 28983 1726883013.00153: done sending task result for task 
0affe814-3a2d-b16d-c0a7-000000000b3a 28983 1726883013.00157: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883013.00226: no more pending results, returning what we have 28983 1726883013.00230: results queue empty 28983 1726883013.00231: checking for any_errors_fatal 28983 1726883013.00241: done checking for any_errors_fatal 28983 1726883013.00242: checking for max_fail_percentage 28983 1726883013.00244: done checking for max_fail_percentage 28983 1726883013.00245: checking to see if all hosts have failed and the running result is not ok 28983 1726883013.00246: done checking to see if all hosts have failed 28983 1726883013.00247: getting the remaining hosts for this loop 28983 1726883013.00249: done getting the remaining hosts for this loop 28983 1726883013.00255: getting the next task for host managed_node2 28983 1726883013.00266: done getting next task for host managed_node2 28983 1726883013.00273: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883013.00279: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883013.00304: getting variables 28983 1726883013.00305: in VariableManager get_vars() 28983 1726883013.00453: Calling all_inventory to load vars for managed_node2 28983 1726883013.00457: Calling groups_inventory to load vars for managed_node2 28983 1726883013.00460: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883013.00474: Calling all_plugins_play to load vars for managed_node2 28983 1726883013.00478: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883013.00483: Calling groups_plugins_play to load vars for managed_node2 28983 1726883013.03166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883013.06145: done with get_vars() 28983 1726883013.06190: done getting variables 28983 1726883013.06263: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:43:33 -0400 (0:00:00.142) 0:00:43.060 ****** 28983 1726883013.06309: entering _queue_task() for managed_node2/package 28983 1726883013.06692: worker is 1 (out of 1 available) 28983 1726883013.06707: exiting _queue_task() for managed_node2/package 28983 1726883013.06721: done queuing things up, now 
waiting for results queue to drain 28983 1726883013.06723: waiting for pending results... 28983 1726883013.07155: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883013.07264: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b3b 28983 1726883013.07290: variable 'ansible_search_path' from source: unknown 28983 1726883013.07298: variable 'ansible_search_path' from source: unknown 28983 1726883013.07344: calling self._execute() 28983 1726883013.07455: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.07475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883013.07493: variable 'omit' from source: magic vars 28983 1726883013.07946: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.07966: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883013.08224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883013.08558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883013.08620: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883013.08665: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883013.08756: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883013.08902: variable 'network_packages' from source: role '' defaults 28983 1726883013.09048: variable '__network_provider_setup' from source: role '' defaults 28983 1726883013.09066: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883013.09155: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883013.09170: variable '__network_packages_default_nm' 
from source: role '' defaults 28983 1726883013.09329: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883013.09533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883013.11302: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883013.11353: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883013.11388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883013.11417: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883013.11442: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883013.11513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.11538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.11560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.11602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.11614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.11657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.11679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.11705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.11764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.11768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.12245: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883013.12249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.12255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.12260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883013.12309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.12325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.12437: variable 'ansible_python' from source: facts 28983 1726883013.12457: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883013.12561: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883013.12661: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883013.12831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.12861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.12906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.12973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.12978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883013.13016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.13038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.13068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.13106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.13124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.13308: variable 'network_connections' from source: include params 28983 1726883013.13312: variable 'interface' from source: play vars 28983 1726883013.13476: variable 'interface' from source: play vars 28983 1726883013.13503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883013.13545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883013.13602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 
1726883013.13651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883013.13720: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883013.14253: variable 'network_connections' from source: include params 28983 1726883013.14256: variable 'interface' from source: play vars 28983 1726883013.14364: variable 'interface' from source: play vars 28983 1726883013.14424: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883013.14507: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883013.14770: variable 'network_connections' from source: include params 28983 1726883013.14777: variable 'interface' from source: play vars 28983 1726883013.14832: variable 'interface' from source: play vars 28983 1726883013.14856: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883013.14925: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883013.15181: variable 'network_connections' from source: include params 28983 1726883013.15185: variable 'interface' from source: play vars 28983 1726883013.15243: variable 'interface' from source: play vars 28983 1726883013.15298: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883013.15349: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883013.15357: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883013.15409: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883013.15594: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883013.15999: variable 'network_connections' from source: include params 28983 
1726883013.16003: variable 'interface' from source: play vars 28983 1726883013.16055: variable 'interface' from source: play vars 28983 1726883013.16068: variable 'ansible_distribution' from source: facts 28983 1726883013.16071: variable '__network_rh_distros' from source: role '' defaults 28983 1726883013.16079: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.16101: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883013.16240: variable 'ansible_distribution' from source: facts 28983 1726883013.16244: variable '__network_rh_distros' from source: role '' defaults 28983 1726883013.16251: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.16287: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883013.16522: variable 'ansible_distribution' from source: facts 28983 1726883013.16526: variable '__network_rh_distros' from source: role '' defaults 28983 1726883013.16528: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.16584: variable 'network_provider' from source: set_fact 28983 1726883013.16612: variable 'ansible_facts' from source: unknown 28983 1726883013.17605: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883013.17609: when evaluation is False, skipping this task 28983 1726883013.17613: _execute() done 28983 1726883013.17616: dumping result to json 28983 1726883013.17618: done dumping result, returning 28983 1726883013.17630: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000000b3b] 28983 1726883013.17633: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3b 28983 1726883013.17745: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3b 28983 1726883013.17748: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883013.17805: no more pending results, returning what we have 28983 1726883013.17809: results queue empty 28983 1726883013.17810: checking for any_errors_fatal 28983 1726883013.17818: done checking for any_errors_fatal 28983 1726883013.17819: checking for max_fail_percentage 28983 1726883013.17820: done checking for max_fail_percentage 28983 1726883013.17821: checking to see if all hosts have failed and the running result is not ok 28983 1726883013.17822: done checking to see if all hosts have failed 28983 1726883013.17823: getting the remaining hosts for this loop 28983 1726883013.17825: done getting the remaining hosts for this loop 28983 1726883013.17831: getting the next task for host managed_node2 28983 1726883013.17841: done getting next task for host managed_node2 28983 1726883013.17847: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883013.17853: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883013.17878: getting variables 28983 1726883013.17879: in VariableManager get_vars() 28983 1726883013.17922: Calling all_inventory to load vars for managed_node2 28983 1726883013.17925: Calling groups_inventory to load vars for managed_node2 28983 1726883013.17928: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883013.17945: Calling all_plugins_play to load vars for managed_node2 28983 1726883013.17949: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883013.17953: Calling groups_plugins_play to load vars for managed_node2 28983 1726883013.19219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883013.20929: done with get_vars() 28983 1726883013.20953: done getting variables 28983 1726883013.21007: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:43:33 -0400 (0:00:00.147) 0:00:43.208 ****** 28983 1726883013.21037: entering _queue_task() for managed_node2/package 28983 1726883013.21268: worker is 1 (out of 1 available) 28983 1726883013.21286: exiting _queue_task() for managed_node2/package 28983 1726883013.21300: done queuing things up, now waiting for results 
queue to drain 28983 1726883013.21302: waiting for pending results... 28983 1726883013.21494: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883013.21608: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b3c 28983 1726883013.21623: variable 'ansible_search_path' from source: unknown 28983 1726883013.21627: variable 'ansible_search_path' from source: unknown 28983 1726883013.21663: calling self._execute() 28983 1726883013.21743: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.21752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883013.21764: variable 'omit' from source: magic vars 28983 1726883013.22075: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.22091: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883013.22196: variable 'network_state' from source: role '' defaults 28983 1726883013.22261: Evaluated conditional (network_state != {}): False 28983 1726883013.22265: when evaluation is False, skipping this task 28983 1726883013.22267: _execute() done 28983 1726883013.22270: dumping result to json 28983 1726883013.22275: done dumping result, returning 28983 1726883013.22278: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000b3c] 28983 1726883013.22281: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3c 28983 1726883013.22359: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3c 28983 1726883013.22362: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883013.22442: no more pending results, returning what we have 28983 
1726883013.22445: results queue empty 28983 1726883013.22446: checking for any_errors_fatal 28983 1726883013.22452: done checking for any_errors_fatal 28983 1726883013.22453: checking for max_fail_percentage 28983 1726883013.22455: done checking for max_fail_percentage 28983 1726883013.22456: checking to see if all hosts have failed and the running result is not ok 28983 1726883013.22457: done checking to see if all hosts have failed 28983 1726883013.22457: getting the remaining hosts for this loop 28983 1726883013.22459: done getting the remaining hosts for this loop 28983 1726883013.22463: getting the next task for host managed_node2 28983 1726883013.22474: done getting next task for host managed_node2 28983 1726883013.22479: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883013.22485: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883013.22505: getting variables 28983 1726883013.22506: in VariableManager get_vars() 28983 1726883013.22537: Calling all_inventory to load vars for managed_node2 28983 1726883013.22540: Calling groups_inventory to load vars for managed_node2 28983 1726883013.22542: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883013.22548: Calling all_plugins_play to load vars for managed_node2 28983 1726883013.22551: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883013.22553: Calling groups_plugins_play to load vars for managed_node2 28983 1726883013.23782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883013.25376: done with get_vars() 28983 1726883013.25398: done getting variables 28983 1726883013.25445: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:43:33 -0400 (0:00:00.044) 0:00:43.252 ****** 28983 1726883013.25475: entering _queue_task() for managed_node2/package 28983 1726883013.25678: worker is 1 (out of 1 available) 28983 1726883013.25693: exiting _queue_task() for managed_node2/package 28983 1726883013.25706: done queuing things up, now waiting for results queue to drain 28983 1726883013.25708: waiting for pending results... 
28983 1726883013.25908: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883013.26020: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b3d 28983 1726883013.26032: variable 'ansible_search_path' from source: unknown 28983 1726883013.26038: variable 'ansible_search_path' from source: unknown 28983 1726883013.26072: calling self._execute() 28983 1726883013.26157: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.26162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883013.26171: variable 'omit' from source: magic vars 28983 1726883013.26487: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.26499: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883013.26604: variable 'network_state' from source: role '' defaults 28983 1726883013.26617: Evaluated conditional (network_state != {}): False 28983 1726883013.26620: when evaluation is False, skipping this task 28983 1726883013.26623: _execute() done 28983 1726883013.26626: dumping result to json 28983 1726883013.26631: done dumping result, returning 28983 1726883013.26641: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000b3d] 28983 1726883013.26648: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3d 28983 1726883013.26751: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3d 28983 1726883013.26754: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883013.26812: no more pending results, returning what we have 28983 1726883013.26816: results queue empty 28983 1726883013.26817: checking for 
any_errors_fatal 28983 1726883013.26822: done checking for any_errors_fatal 28983 1726883013.26823: checking for max_fail_percentage 28983 1726883013.26824: done checking for max_fail_percentage 28983 1726883013.26825: checking to see if all hosts have failed and the running result is not ok 28983 1726883013.26826: done checking to see if all hosts have failed 28983 1726883013.26827: getting the remaining hosts for this loop 28983 1726883013.26829: done getting the remaining hosts for this loop 28983 1726883013.26833: getting the next task for host managed_node2 28983 1726883013.26847: done getting next task for host managed_node2 28983 1726883013.26851: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883013.26857: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883013.26881: getting variables 28983 1726883013.26882: in VariableManager get_vars() 28983 1726883013.26913: Calling all_inventory to load vars for managed_node2 28983 1726883013.26915: Calling groups_inventory to load vars for managed_node2 28983 1726883013.26917: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883013.26924: Calling all_plugins_play to load vars for managed_node2 28983 1726883013.26926: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883013.26928: Calling groups_plugins_play to load vars for managed_node2 28983 1726883013.28919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883013.30505: done with get_vars() 28983 1726883013.30527: done getting variables 28983 1726883013.30577: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:43:33 -0400 (0:00:00.051) 0:00:43.303 ****** 28983 1726883013.30606: entering _queue_task() for managed_node2/service 28983 1726883013.30817: worker is 1 (out of 1 available) 28983 1726883013.30831: exiting _queue_task() for managed_node2/service 28983 1726883013.30846: done queuing things up, now waiting for results queue to drain 28983 1726883013.30848: waiting for pending results... 
28983 1726883013.31043: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883013.31153: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b3e 28983 1726883013.31167: variable 'ansible_search_path' from source: unknown 28983 1726883013.31170: variable 'ansible_search_path' from source: unknown 28983 1726883013.31205: calling self._execute() 28983 1726883013.31290: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.31295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883013.31309: variable 'omit' from source: magic vars 28983 1726883013.31626: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.31639: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883013.31739: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883013.32061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883013.34774: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883013.34860: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883013.35039: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883013.35043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883013.35045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883013.35081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883013.35137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.35175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.35231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.35256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.35322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.35358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.35395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.35451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.35473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.35528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.35563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.35598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.35657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.35683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.35913: variable 'network_connections' from source: include params 28983 1726883013.35932: variable 'interface' from source: play vars 28983 1726883013.36016: variable 'interface' from source: play vars 28983 1726883013.36109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883013.36439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883013.36443: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883013.36445: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883013.36448: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883013.36491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883013.36521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883013.36559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.36595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883013.36672: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883013.37002: variable 'network_connections' from source: include params 28983 1726883013.37013: variable 'interface' from source: play vars 28983 1726883013.37092: variable 'interface' from source: play vars 28983 1726883013.37135: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883013.37145: when evaluation is False, skipping this task 28983 1726883013.37152: _execute() done 28983 1726883013.37160: dumping result to json 28983 1726883013.37168: done dumping result, returning 28983 1726883013.37179: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000b3e] 28983 1726883013.37189: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3e skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883013.37364: no more pending results, returning what we have 28983 1726883013.37368: results queue empty 28983 1726883013.37368: checking for any_errors_fatal 28983 1726883013.37381: done checking for any_errors_fatal 28983 1726883013.37382: checking for max_fail_percentage 28983 1726883013.37384: done checking for max_fail_percentage 28983 1726883013.37385: checking to see if all hosts have failed and the running result is not ok 28983 1726883013.37386: done checking to see if all hosts have failed 28983 1726883013.37387: getting the remaining hosts for this loop 28983 1726883013.37389: done getting the remaining hosts for this loop 28983 1726883013.37394: getting the next task for host managed_node2 28983 1726883013.37403: done getting next task for host managed_node2 28983 1726883013.37408: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883013.37414: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883013.37437: getting variables 28983 1726883013.37439: in VariableManager get_vars() 28983 1726883013.37482: Calling all_inventory to load vars for managed_node2 28983 1726883013.37486: Calling groups_inventory to load vars for managed_node2 28983 1726883013.37488: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883013.37498: Calling all_plugins_play to load vars for managed_node2 28983 1726883013.37501: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883013.37505: Calling groups_plugins_play to load vars for managed_node2 28983 1726883013.38051: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3e 28983 1726883013.38054: WORKER PROCESS EXITING 28983 1726883013.39765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883013.42892: done with get_vars() 28983 1726883013.42928: done getting variables 28983 1726883013.42995: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:43:33 -0400 (0:00:00.124) 0:00:43.428 ****** 28983 1726883013.43039: entering _queue_task() for managed_node2/service 28983 1726883013.43383: worker is 1 (out of 1 available) 28983 1726883013.43397: exiting _queue_task() for managed_node2/service 28983 1726883013.43411: done queuing 
things up, now waiting for results queue to drain 28983 1726883013.43413: waiting for pending results... 28983 1726883013.43723: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883013.43916: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b3f 28983 1726883013.43942: variable 'ansible_search_path' from source: unknown 28983 1726883013.43952: variable 'ansible_search_path' from source: unknown 28983 1726883013.44002: calling self._execute() 28983 1726883013.44123: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.44140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883013.44159: variable 'omit' from source: magic vars 28983 1726883013.44689: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.44709: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883013.44960: variable 'network_provider' from source: set_fact 28983 1726883013.44981: variable 'network_state' from source: role '' defaults 28983 1726883013.45000: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883013.45022: variable 'omit' from source: magic vars 28983 1726883013.45108: variable 'omit' from source: magic vars 28983 1726883013.45163: variable 'network_service_name' from source: role '' defaults 28983 1726883013.45254: variable 'network_service_name' from source: role '' defaults 28983 1726883013.45410: variable '__network_provider_setup' from source: role '' defaults 28983 1726883013.45422: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883013.45522: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883013.45539: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883013.45627: variable '__network_packages_default_nm' from source: role '' defaults 
28983 1726883013.45957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883013.48785: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883013.48866: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883013.48920: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883013.48981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883013.49021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883013.49122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.49166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.49221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.49281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.49311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.49386: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.49424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.49475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.49531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.49572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.49887: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883013.50069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.50123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.50339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.50343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.50346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.50348: variable 'ansible_python' from source: facts 28983 1726883013.50366: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883013.50473: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883013.50594: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883013.50781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.50824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.50873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.50932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.50965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.51033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883013.51078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883013.51117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.51176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883013.51199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883013.51395: variable 'network_connections' from source: include params 28983 1726883013.51409: variable 'interface' from source: play vars 28983 1726883013.51503: variable 'interface' from source: play vars 28983 1726883013.51643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883013.51878: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883013.51945: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883013.52018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883013.52087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883013.52197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883013.52240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883013.52290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883013.52360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883013.52421: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883013.52866: variable 'network_connections' from source: include params 28983 1726883013.52904: variable 'interface' from source: play vars 28983 1726883013.53025: variable 'interface' from source: play vars 28983 1726883013.53072: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883013.53252: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883013.53689: variable 'network_connections' from source: include params 28983 1726883013.53777: variable 'interface' from source: play vars 28983 1726883013.53793: variable 'interface' from source: play vars 28983 1726883013.53825: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883013.53930: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883013.54338: variable 'network_connections' from source: include params 28983 1726883013.54350: variable 'interface' from source: play vars 28983 1726883013.54440: variable 'interface' from source: play vars 28983 1726883013.54517: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883013.54599: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883013.54612: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883013.54697: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883013.55013: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883013.55699: variable 'network_connections' from source: include params 28983 1726883013.55724: variable 'interface' from source: play vars 28983 1726883013.55787: variable 'interface' from source: play vars 28983 1726883013.55803: variable 'ansible_distribution' from source: facts 28983 1726883013.55807: variable '__network_rh_distros' from source: role '' defaults 28983 1726883013.55811: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.55840: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883013.56010: variable 'ansible_distribution' from source: facts 28983 1726883013.56014: variable '__network_rh_distros' from source: role '' defaults 28983 1726883013.56022: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.56029: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883013.56182: variable 'ansible_distribution' from source: facts 28983 1726883013.56185: variable '__network_rh_distros' from source: role '' defaults 28983 1726883013.56192: variable 'ansible_distribution_major_version' from source: facts 28983 1726883013.56221: variable 'network_provider' from source: set_fact 28983 1726883013.56243: variable 'omit' from source: magic vars 28983 1726883013.56274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883013.56296: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883013.56314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883013.56329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883013.56340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883013.56370: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883013.56376: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.56378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883013.56457: Set connection var ansible_connection to ssh 28983 1726883013.56470: Set connection var ansible_shell_executable to /bin/sh 28983 1726883013.56479: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883013.56489: Set connection var ansible_timeout to 10 28983 1726883013.56495: Set connection var ansible_pipelining to False 28983 1726883013.56498: Set connection var ansible_shell_type to sh 28983 1726883013.56519: variable 'ansible_shell_executable' from source: unknown 28983 1726883013.56522: variable 'ansible_connection' from source: unknown 28983 1726883013.56525: variable 'ansible_module_compression' from source: unknown 28983 1726883013.56529: variable 'ansible_shell_type' from source: unknown 28983 1726883013.56531: variable 'ansible_shell_executable' from source: unknown 28983 1726883013.56539: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883013.56542: variable 'ansible_pipelining' from source: unknown 28983 1726883013.56547: variable 'ansible_timeout' from source: unknown 28983 1726883013.56552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883013.56636: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883013.56648: variable 'omit' from source: magic vars 28983 1726883013.56654: starting attempt loop 28983 1726883013.56656: running the handler 28983 1726883013.56724: variable 'ansible_facts' from source: unknown 28983 1726883013.57645: _low_level_execute_command(): starting 28983 1726883013.57648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883013.59043: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883013.59214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883013.59339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883013.61110: stdout 
chunk (state=3): >>>/root <<< 28983 1726883013.61215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883013.61266: stderr chunk (state=3): >>><<< 28983 1726883013.61270: stdout chunk (state=3): >>><<< 28983 1726883013.61291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883013.61304: _low_level_execute_command(): starting 28983 1726883013.61312: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190 `" && echo ansible-tmp-1726883013.6129196-30556-273765168497190="` echo /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190 `" ) && sleep 0' 28983 1726883013.61736: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883013.61739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883013.61744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883013.61747: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883013.61749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883013.61800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883013.61803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883013.61885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883013.63896: stdout chunk (state=3): >>>ansible-tmp-1726883013.6129196-30556-273765168497190=/root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190 <<< 28983 1726883013.64011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883013.64065: stderr chunk (state=3): >>><<< 28983 1726883013.64067: stdout chunk (state=3): >>><<< 28983 1726883013.64141: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726883013.6129196-30556-273765168497190=/root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883013.64145: variable 'ansible_module_compression' from source: unknown 28983 1726883013.64148: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883013.64210: variable 'ansible_facts' from source: unknown 28983 1726883013.64357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py 28983 1726883013.64477: Sending initial data 28983 1726883013.64482: Sent initial data (156 bytes) 28983 1726883013.64940: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 
1726883013.64943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883013.64946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883013.64973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883013.64977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883013.65028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883013.65099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883013.66774: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883013.66828: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883013.66895: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp7dpa5l6c /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py <<< 28983 1726883013.66898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py" <<< 28983 1726883013.66961: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp7dpa5l6c" to remote "/root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py" <<< 28983 1726883013.66973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py" <<< 28983 1726883013.69330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883013.69388: stderr chunk (state=3): >>><<< 28983 1726883013.69393: stdout chunk (state=3): >>><<< 28983 1726883013.69414: done transferring module to remote 28983 1726883013.69424: _low_level_execute_command(): starting 28983 1726883013.69429: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/ /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py && sleep 0' 28983 1726883013.69824: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883013.69839: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883013.69843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883013.69868: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883013.69874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883013.69925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883013.69932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883013.70001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883013.71895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883013.71940: stderr chunk (state=3): >>><<< 28983 1726883013.71943: stdout chunk (state=3): >>><<< 28983 1726883013.71957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883013.71960: _low_level_execute_command(): starting 28983 1726883013.71966: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/AnsiballZ_systemd.py && sleep 0' 28983 1726883013.72396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883013.72400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883013.72402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883013.72405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883013.72461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883013.72465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883013.72536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883014.05098: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4444160", "MemoryAvailable": "infinity", "CPUUsageNSec": "1519686000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726883014.05124: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service 
cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": 
"NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 28983 1726883014.05133: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883014.07340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883014.07344: stderr chunk (state=3): >>><<< 28983 1726883014.07347: stdout chunk (state=3): >>><<< 28983 1726883014.07351: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": 
"success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4444160", "MemoryAvailable": "infinity", "CPUUsageNSec": "1519686000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not 
set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883014.07600: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883014.07618: _low_level_execute_command(): starting 28983 1726883014.07623: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883013.6129196-30556-273765168497190/ > /dev/null 2>&1 && sleep 0' 28983 1726883014.08103: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.08107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.08110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883014.08113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.08164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883014.08172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883014.08248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883014.10510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883014.10536: stderr chunk (state=3): >>><<< 28983 1726883014.10539: stdout chunk (state=3): >>><<< 28983 1726883014.10554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883014.10739: handler run complete 28983 1726883014.10742: attempt loop complete, returning result 28983 1726883014.10744: _execute() done 28983 1726883014.10747: dumping result to json 28983 1726883014.10749: done dumping result, returning 28983 1726883014.10751: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000000b3f] 28983 1726883014.10753: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3f ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883014.11111: no more pending results, returning what we have 28983 1726883014.11115: results queue empty 28983 1726883014.11116: checking for any_errors_fatal 28983 1726883014.11124: done checking for any_errors_fatal 28983 1726883014.11125: checking for max_fail_percentage 28983 1726883014.11127: done checking for max_fail_percentage 28983 1726883014.11128: checking to see if all hosts have failed and the running result is not ok 28983 1726883014.11129: done checking to see if all hosts have failed 28983 1726883014.11130: getting the remaining hosts for this loop 28983 1726883014.11132: done getting the remaining hosts for this loop 28983 1726883014.11139: getting the next task for host managed_node2 28983 1726883014.11148: done getting next 
task for host managed_node2 28983 1726883014.11152: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883014.11157: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883014.11175: getting variables 28983 1726883014.11177: in VariableManager get_vars() 28983 1726883014.11349: Calling all_inventory to load vars for managed_node2 28983 1726883014.11353: Calling groups_inventory to load vars for managed_node2 28983 1726883014.11356: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883014.11367: Calling all_plugins_play to load vars for managed_node2 28983 1726883014.11370: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883014.11375: Calling groups_plugins_play to load vars for managed_node2 28983 1726883014.11951: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b3f 28983 1726883014.11954: WORKER PROCESS EXITING 28983 1726883014.13650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883014.16678: done with get_vars() 28983 1726883014.16717: done getting variables 28983 1726883014.16796: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:43:34 -0400 (0:00:00.738) 0:00:44.166 ****** 28983 1726883014.16848: entering _queue_task() for managed_node2/service 28983 1726883014.17197: worker is 1 (out of 1 available) 28983 1726883014.17212: exiting _queue_task() for managed_node2/service 28983 1726883014.17226: done queuing things up, now waiting for results queue to drain 28983 1726883014.17228: waiting for pending results... 
28983 1726883014.17565: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883014.17770: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b40 28983 1726883014.17803: variable 'ansible_search_path' from source: unknown 28983 1726883014.17814: variable 'ansible_search_path' from source: unknown 28983 1726883014.17890: calling self._execute() 28983 1726883014.17980: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883014.17997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883014.18016: variable 'omit' from source: magic vars 28983 1726883014.18486: variable 'ansible_distribution_major_version' from source: facts 28983 1726883014.18542: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883014.18673: variable 'network_provider' from source: set_fact 28983 1726883014.18688: Evaluated conditional (network_provider == "nm"): True 28983 1726883014.18815: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883014.18938: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883014.19199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883014.22092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883014.22239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883014.22242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883014.22276: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883014.22315: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883014.22417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883014.22459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883014.22504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883014.22561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883014.22588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883014.22655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883014.22692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883014.22825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883014.22829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883014.22831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883014.22868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883014.22905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883014.22943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883014.23002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883014.23024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883014.23216: variable 'network_connections' from source: include params 28983 1726883014.23236: variable 'interface' from source: play vars 28983 1726883014.23325: variable 'interface' from source: play vars 28983 1726883014.23429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883014.23666: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883014.23723: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883014.23767: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883014.23824: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883014.23870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883014.23933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883014.23947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883014.23986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883014.24048: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883014.24401: variable 'network_connections' from source: include params 28983 1726883014.24639: variable 'interface' from source: play vars 28983 1726883014.24642: variable 'interface' from source: play vars 28983 1726883014.24644: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883014.24647: when evaluation is False, skipping this task 28983 1726883014.24649: _execute() done 28983 1726883014.24651: dumping result to json 28983 1726883014.24653: done dumping result, returning 28983 1726883014.24655: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000000b40] 28983 
1726883014.24664: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b40 28983 1726883014.24742: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b40 28983 1726883014.24746: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883014.24806: no more pending results, returning what we have 28983 1726883014.24810: results queue empty 28983 1726883014.24811: checking for any_errors_fatal 28983 1726883014.24837: done checking for any_errors_fatal 28983 1726883014.24839: checking for max_fail_percentage 28983 1726883014.24842: done checking for max_fail_percentage 28983 1726883014.24843: checking to see if all hosts have failed and the running result is not ok 28983 1726883014.24844: done checking to see if all hosts have failed 28983 1726883014.24845: getting the remaining hosts for this loop 28983 1726883014.24847: done getting the remaining hosts for this loop 28983 1726883014.24853: getting the next task for host managed_node2 28983 1726883014.24864: done getting next task for host managed_node2 28983 1726883014.24869: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883014.24878: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883014.24905: getting variables 28983 1726883014.24907: in VariableManager get_vars() 28983 1726883014.24954: Calling all_inventory to load vars for managed_node2 28983 1726883014.24958: Calling groups_inventory to load vars for managed_node2 28983 1726883014.24961: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883014.24976: Calling all_plugins_play to load vars for managed_node2 28983 1726883014.24981: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883014.24985: Calling groups_plugins_play to load vars for managed_node2 28983 1726883014.27674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883014.30738: done with get_vars() 28983 1726883014.30775: done getting variables 28983 1726883014.30847: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:43:34 -0400 (0:00:00.140) 0:00:44.306 
****** 28983 1726883014.30891: entering _queue_task() for managed_node2/service 28983 1726883014.31225: worker is 1 (out of 1 available) 28983 1726883014.31343: exiting _queue_task() for managed_node2/service 28983 1726883014.31357: done queuing things up, now waiting for results queue to drain 28983 1726883014.31359: waiting for pending results... 28983 1726883014.31651: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883014.31806: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b41 28983 1726883014.31832: variable 'ansible_search_path' from source: unknown 28983 1726883014.31846: variable 'ansible_search_path' from source: unknown 28983 1726883014.31896: calling self._execute() 28983 1726883014.32049: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883014.32052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883014.32055: variable 'omit' from source: magic vars 28983 1726883014.32531: variable 'ansible_distribution_major_version' from source: facts 28983 1726883014.32555: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883014.32840: variable 'network_provider' from source: set_fact 28983 1726883014.32844: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883014.32847: when evaluation is False, skipping this task 28983 1726883014.32849: _execute() done 28983 1726883014.32852: dumping result to json 28983 1726883014.32854: done dumping result, returning 28983 1726883014.32856: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000000b41] 28983 1726883014.32859: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b41 28983 1726883014.32943: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b41 28983 1726883014.32947: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883014.33006: no more pending results, returning what we have 28983 1726883014.33011: results queue empty 28983 1726883014.33012: checking for any_errors_fatal 28983 1726883014.33021: done checking for any_errors_fatal 28983 1726883014.33022: checking for max_fail_percentage 28983 1726883014.33025: done checking for max_fail_percentage 28983 1726883014.33026: checking to see if all hosts have failed and the running result is not ok 28983 1726883014.33028: done checking to see if all hosts have failed 28983 1726883014.33028: getting the remaining hosts for this loop 28983 1726883014.33031: done getting the remaining hosts for this loop 28983 1726883014.33038: getting the next task for host managed_node2 28983 1726883014.33050: done getting next task for host managed_node2 28983 1726883014.33056: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883014.33063: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883014.33093: getting variables 28983 1726883014.33095: in VariableManager get_vars() 28983 1726883014.33342: Calling all_inventory to load vars for managed_node2 28983 1726883014.33346: Calling groups_inventory to load vars for managed_node2 28983 1726883014.33349: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883014.33358: Calling all_plugins_play to load vars for managed_node2 28983 1726883014.33362: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883014.33366: Calling groups_plugins_play to load vars for managed_node2 28983 1726883014.35585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883014.43580: done with get_vars() 28983 1726883014.43621: done getting variables 28983 1726883014.43685: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:43:34 -0400 (0:00:00.128) 0:00:44.434 ****** 28983 1726883014.43722: entering _queue_task() for managed_node2/copy 28983 1726883014.44107: worker is 1 (out of 1 available) 28983 1726883014.44121: exiting _queue_task() for managed_node2/copy 28983 1726883014.44239: done queuing things up, now waiting for results queue to drain 28983 1726883014.44242: waiting for 
pending results... 28983 1726883014.44557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883014.44740: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b42 28983 1726883014.44746: variable 'ansible_search_path' from source: unknown 28983 1726883014.44751: variable 'ansible_search_path' from source: unknown 28983 1726883014.44784: calling self._execute() 28983 1726883014.44910: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883014.44926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883014.44979: variable 'omit' from source: magic vars 28983 1726883014.45437: variable 'ansible_distribution_major_version' from source: facts 28983 1726883014.45460: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883014.45629: variable 'network_provider' from source: set_fact 28983 1726883014.45736: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883014.45740: when evaluation is False, skipping this task 28983 1726883014.45745: _execute() done 28983 1726883014.45748: dumping result to json 28983 1726883014.45750: done dumping result, returning 28983 1726883014.45753: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000000b42] 28983 1726883014.45756: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b42 28983 1726883014.45857: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b42 28983 1726883014.45861: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883014.45920: no more pending results, returning what we have 28983 1726883014.45924: results queue empty 28983 
1726883014.45925: checking for any_errors_fatal 28983 1726883014.45938: done checking for any_errors_fatal 28983 1726883014.45940: checking for max_fail_percentage 28983 1726883014.45942: done checking for max_fail_percentage 28983 1726883014.45943: checking to see if all hosts have failed and the running result is not ok 28983 1726883014.45944: done checking to see if all hosts have failed 28983 1726883014.45945: getting the remaining hosts for this loop 28983 1726883014.45948: done getting the remaining hosts for this loop 28983 1726883014.45953: getting the next task for host managed_node2 28983 1726883014.45963: done getting next task for host managed_node2 28983 1726883014.45969: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883014.45979: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883014.46005: getting variables 28983 1726883014.46007: in VariableManager get_vars() 28983 1726883014.46251: Calling all_inventory to load vars for managed_node2 28983 1726883014.46255: Calling groups_inventory to load vars for managed_node2 28983 1726883014.46259: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883014.46269: Calling all_plugins_play to load vars for managed_node2 28983 1726883014.46275: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883014.46279: Calling groups_plugins_play to load vars for managed_node2 28983 1726883014.48509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883014.51489: done with get_vars() 28983 1726883014.51525: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:43:34 -0400 (0:00:00.079) 0:00:44.514 ****** 28983 1726883014.51625: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883014.51946: worker is 1 (out of 1 available) 28983 1726883014.51960: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883014.51975: done queuing things up, now waiting for results queue to drain 28983 1726883014.51977: waiting for pending results... 
28983 1726883014.52454: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883014.52479: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b43 28983 1726883014.52501: variable 'ansible_search_path' from source: unknown 28983 1726883014.52509: variable 'ansible_search_path' from source: unknown 28983 1726883014.52560: calling self._execute() 28983 1726883014.52672: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883014.52687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883014.52704: variable 'omit' from source: magic vars 28983 1726883014.53165: variable 'ansible_distribution_major_version' from source: facts 28983 1726883014.53184: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883014.53204: variable 'omit' from source: magic vars 28983 1726883014.53313: variable 'omit' from source: magic vars 28983 1726883014.53495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883014.56142: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883014.56657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883014.56939: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883014.56944: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883014.56947: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883014.56950: variable 'network_provider' from source: set_fact 28983 1726883014.57047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883014.57094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883014.57138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883014.57202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883014.57226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883014.57325: variable 'omit' from source: magic vars 28983 1726883014.57471: variable 'omit' from source: magic vars 28983 1726883014.57616: variable 'network_connections' from source: include params 28983 1726883014.57633: variable 'interface' from source: play vars 28983 1726883014.57717: variable 'interface' from source: play vars 28983 1726883014.57919: variable 'omit' from source: magic vars 28983 1726883014.57939: variable '__lsr_ansible_managed' from source: task vars 28983 1726883014.58015: variable '__lsr_ansible_managed' from source: task vars 28983 1726883014.58240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883014.58559: Loaded config def from plugin (lookup/template) 28983 1726883014.58572: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883014.58613: File lookup term: get_ansible_managed.j2 28983 1726883014.58620: variable 
'ansible_search_path' from source: unknown 28983 1726883014.58629: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883014.58650: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883014.58687: variable 'ansible_search_path' from source: unknown 28983 1726883014.72704: variable 'ansible_managed' from source: unknown 28983 1726883014.73019: variable 'omit' from source: magic vars 28983 1726883014.73023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883014.73047: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883014.73127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883014.73130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883014.73133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883014.73151: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883014.73165: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883014.73175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883014.73301: Set connection var ansible_connection to ssh 28983 1726883014.73319: Set connection var ansible_shell_executable to /bin/sh 28983 1726883014.73339: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883014.73355: Set connection var ansible_timeout to 10 28983 1726883014.73366: Set connection var ansible_pipelining to False 28983 1726883014.73378: Set connection var ansible_shell_type to sh 28983 1726883014.73452: variable 'ansible_shell_executable' from source: unknown 28983 1726883014.73455: variable 'ansible_connection' from source: unknown 28983 1726883014.73457: variable 'ansible_module_compression' from source: unknown 28983 1726883014.73459: variable 'ansible_shell_type' from source: unknown 28983 1726883014.73462: variable 'ansible_shell_executable' from source: unknown 28983 1726883014.73464: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883014.73466: variable 'ansible_pipelining' from source: unknown 28983 1726883014.73468: variable 'ansible_timeout' from source: unknown 28983 1726883014.73470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883014.73633: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883014.73704: variable 'omit' from 
source: magic vars 28983 1726883014.73707: starting attempt loop 28983 1726883014.73709: running the handler 28983 1726883014.73711: _low_level_execute_command(): starting 28983 1726883014.73717: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883014.74454: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883014.74478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883014.74494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.74514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883014.74532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883014.74555: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883014.74593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.74619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883014.74703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.74723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883014.74749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883014.74767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883014.74882: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883014.76651: stdout chunk (state=3): >>>/root <<< 28983 1726883014.76795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883014.76823: stderr chunk (state=3): >>><<< 28983 1726883014.76825: stdout chunk (state=3): >>><<< 28983 1726883014.76841: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883014.76895: _low_level_execute_command(): starting 28983 1726883014.76900: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317 `" && echo ansible-tmp-1726883014.7684715-30584-117955473678317="` echo /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317 
`" ) && sleep 0' 28983 1726883014.77305: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.77309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883014.77311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.77314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.77316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.77367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883014.77372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883014.77448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883014.79442: stdout chunk (state=3): >>>ansible-tmp-1726883014.7684715-30584-117955473678317=/root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317 <<< 28983 1726883014.79559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883014.79603: stderr chunk (state=3): >>><<< 28983 1726883014.79607: stdout chunk (state=3): >>><<< 28983 1726883014.79622: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726883014.7684715-30584-117955473678317=/root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883014.79665: variable 'ansible_module_compression' from source: unknown 28983 1726883014.79704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883014.79744: variable 'ansible_facts' from source: unknown 28983 1726883014.79832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py 28983 1726883014.79944: Sending initial data 28983 1726883014.79948: Sent initial data (168 bytes) 28983 1726883014.80391: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.80397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.80401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883014.80404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883014.80406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.80450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883014.80457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883014.80524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883014.82163: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883014.82239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883014.82320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplnfqvmgm /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py <<< 28983 1726883014.82340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py" <<< 28983 1726883014.82395: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 28983 1726883014.82424: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplnfqvmgm" to remote "/root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py" <<< 28983 1726883014.84294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883014.84337: stderr chunk (state=3): >>><<< 28983 1726883014.84349: stdout chunk (state=3): >>><<< 28983 1726883014.84421: done transferring module to remote 28983 1726883014.84424: _low_level_execute_command(): starting 28983 1726883014.84426: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/ /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py && sleep 0' 28983 1726883014.85091: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 
1726883014.85158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883014.85243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883014.85268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883014.85323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883014.85410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883014.87445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883014.87448: stdout chunk (state=3): >>><<< 28983 1726883014.87451: stderr chunk (state=3): >>><<< 28983 1726883014.87453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883014.87456: _low_level_execute_command(): starting 28983 1726883014.87458: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/AnsiballZ_network_connections.py && sleep 0' 28983 1726883014.88124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883014.88242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883014.88275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883014.88400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.20397: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883015.23242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883015.23248: stdout chunk (state=3): >>><<< 28983 1726883015.23251: stderr chunk (state=3): >>><<< 28983 1726883015.23256: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883015.23259: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883015.23263: _low_level_execute_command(): starting 28983 1726883015.23266: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883014.7684715-30584-117955473678317/ > /dev/null 2>&1 && sleep 0' 28983 1726883015.23813: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883015.23821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883015.23843: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.23847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883015.23868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.23930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.23938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883015.24006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.26052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883015.26055: stdout chunk (state=3): >>><<< 28983 1726883015.26057: stderr chunk (state=3): >>><<< 28983 1726883015.26128: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883015.26132: handler run complete 28983 1726883015.26149: attempt loop complete, returning result 28983 1726883015.26152: _execute() done 28983 1726883015.26154: dumping result to json 28983 1726883015.26169: done dumping result, returning 28983 1726883015.26175: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-000000000b43] 28983 1726883015.26192: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b43 28983 1726883015.26315: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b43 changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a 28983 1726883015.26453: no more pending results, returning what we have 28983 1726883015.26457: results queue empty 28983 1726883015.26458: checking for any_errors_fatal 28983 1726883015.26465: done checking for any_errors_fatal 28983 1726883015.26466: checking for max_fail_percentage 28983 1726883015.26467: done checking for max_fail_percentage 28983 1726883015.26469: checking to see if all hosts have failed and the 
running result is not ok 28983 1726883015.26469: done checking to see if all hosts have failed 28983 1726883015.26470: getting the remaining hosts for this loop 28983 1726883015.26472: done getting the remaining hosts for this loop 28983 1726883015.26477: getting the next task for host managed_node2 28983 1726883015.26486: done getting next task for host managed_node2 28983 1726883015.26491: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883015.26498: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883015.26512: getting variables 28983 1726883015.26513: in VariableManager get_vars() 28983 1726883015.26557: Calling all_inventory to load vars for managed_node2 28983 1726883015.26560: Calling groups_inventory to load vars for managed_node2 28983 1726883015.26563: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.26569: WORKER PROCESS EXITING 28983 1726883015.26579: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.26583: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.26586: Calling groups_plugins_play to load vars for managed_node2 28983 1726883015.28035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883015.30410: done with get_vars() 28983 1726883015.30435: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:43:35 -0400 (0:00:00.788) 0:00:45.302 ****** 28983 1726883015.30508: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883015.30759: worker is 1 (out of 1 available) 28983 1726883015.30772: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883015.30783: done queuing things up, now waiting for results queue to drain 28983 1726883015.30785: waiting for pending results... 
28983 1726883015.30993: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883015.31129: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b44 28983 1726883015.31145: variable 'ansible_search_path' from source: unknown 28983 1726883015.31149: variable 'ansible_search_path' from source: unknown 28983 1726883015.31184: calling self._execute() 28983 1726883015.31266: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.31273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.31289: variable 'omit' from source: magic vars 28983 1726883015.31708: variable 'ansible_distribution_major_version' from source: facts 28983 1726883015.31712: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883015.31939: variable 'network_state' from source: role '' defaults 28983 1726883015.31942: Evaluated conditional (network_state != {}): False 28983 1726883015.31945: when evaluation is False, skipping this task 28983 1726883015.31947: _execute() done 28983 1726883015.31949: dumping result to json 28983 1726883015.31951: done dumping result, returning 28983 1726883015.31954: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000000b44] 28983 1726883015.31957: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b44 28983 1726883015.32040: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b44 28983 1726883015.32044: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883015.32106: no more pending results, returning what we have 28983 1726883015.32110: results queue empty 28983 1726883015.32111: checking for any_errors_fatal 28983 1726883015.32124: done checking for any_errors_fatal 
28983 1726883015.32125: checking for max_fail_percentage 28983 1726883015.32127: done checking for max_fail_percentage 28983 1726883015.32128: checking to see if all hosts have failed and the running result is not ok 28983 1726883015.32129: done checking to see if all hosts have failed 28983 1726883015.32130: getting the remaining hosts for this loop 28983 1726883015.32132: done getting the remaining hosts for this loop 28983 1726883015.32139: getting the next task for host managed_node2 28983 1726883015.32341: done getting next task for host managed_node2 28983 1726883015.32346: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883015.32353: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883015.32372: getting variables 28983 1726883015.32374: in VariableManager get_vars() 28983 1726883015.32408: Calling all_inventory to load vars for managed_node2 28983 1726883015.32411: Calling groups_inventory to load vars for managed_node2 28983 1726883015.32413: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.32421: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.32424: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.32427: Calling groups_plugins_play to load vars for managed_node2 28983 1726883015.34477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883015.36656: done with get_vars() 28983 1726883015.36682: done getting variables 28983 1726883015.36730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:43:35 -0400 (0:00:00.062) 0:00:45.365 ****** 28983 1726883015.36760: entering _queue_task() for managed_node2/debug 28983 1726883015.36992: worker is 1 (out of 1 available) 28983 1726883015.37008: exiting _queue_task() for managed_node2/debug 28983 1726883015.37019: done queuing things up, now waiting for results queue to drain 28983 1726883015.37021: waiting for pending results... 
28983 1726883015.37230: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883015.37543: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b45 28983 1726883015.37547: variable 'ansible_search_path' from source: unknown 28983 1726883015.37550: variable 'ansible_search_path' from source: unknown 28983 1726883015.37554: calling self._execute() 28983 1726883015.37589: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.37605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.37626: variable 'omit' from source: magic vars 28983 1726883015.38136: variable 'ansible_distribution_major_version' from source: facts 28983 1726883015.38158: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883015.38174: variable 'omit' from source: magic vars 28983 1726883015.38264: variable 'omit' from source: magic vars 28983 1726883015.38304: variable 'omit' from source: magic vars 28983 1726883015.38346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883015.38377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883015.38396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883015.38413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883015.38424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883015.38463: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883015.38467: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.38471: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883015.38554: Set connection var ansible_connection to ssh 28983 1726883015.38565: Set connection var ansible_shell_executable to /bin/sh 28983 1726883015.38575: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883015.38585: Set connection var ansible_timeout to 10 28983 1726883015.38591: Set connection var ansible_pipelining to False 28983 1726883015.38594: Set connection var ansible_shell_type to sh 28983 1726883015.38613: variable 'ansible_shell_executable' from source: unknown 28983 1726883015.38616: variable 'ansible_connection' from source: unknown 28983 1726883015.38619: variable 'ansible_module_compression' from source: unknown 28983 1726883015.38622: variable 'ansible_shell_type' from source: unknown 28983 1726883015.38627: variable 'ansible_shell_executable' from source: unknown 28983 1726883015.38629: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.38636: variable 'ansible_pipelining' from source: unknown 28983 1726883015.38639: variable 'ansible_timeout' from source: unknown 28983 1726883015.38646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.38763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883015.38774: variable 'omit' from source: magic vars 28983 1726883015.38783: starting attempt loop 28983 1726883015.38786: running the handler 28983 1726883015.38894: variable '__network_connections_result' from source: set_fact 28983 1726883015.38939: handler run complete 28983 1726883015.38955: attempt loop complete, returning result 28983 1726883015.38958: _execute() done 28983 1726883015.38961: dumping result to json 28983 1726883015.38966: 
done dumping result, returning 28983 1726883015.38980: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000000b45] 28983 1726883015.38983: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b45 28983 1726883015.39075: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b45 28983 1726883015.39078: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a" ] } 28983 1726883015.39166: no more pending results, returning what we have 28983 1726883015.39170: results queue empty 28983 1726883015.39171: checking for any_errors_fatal 28983 1726883015.39176: done checking for any_errors_fatal 28983 1726883015.39176: checking for max_fail_percentage 28983 1726883015.39178: done checking for max_fail_percentage 28983 1726883015.39179: checking to see if all hosts have failed and the running result is not ok 28983 1726883015.39180: done checking to see if all hosts have failed 28983 1726883015.39181: getting the remaining hosts for this loop 28983 1726883015.39183: done getting the remaining hosts for this loop 28983 1726883015.39187: getting the next task for host managed_node2 28983 1726883015.39196: done getting next task for host managed_node2 28983 1726883015.39200: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883015.39205: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883015.39217: getting variables 28983 1726883015.39219: in VariableManager get_vars() 28983 1726883015.39254: Calling all_inventory to load vars for managed_node2 28983 1726883015.39259: Calling groups_inventory to load vars for managed_node2 28983 1726883015.39262: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.39273: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.39276: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.39279: Calling groups_plugins_play to load vars for managed_node2 28983 1726883015.40468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883015.42083: done with get_vars() 28983 1726883015.42107: done getting variables 28983 1726883015.42154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:43:35 -0400 (0:00:00.054) 0:00:45.419 ****** 28983 1726883015.42186: entering _queue_task() for managed_node2/debug 28983 1726883015.42391: worker is 1 (out of 1 available) 28983 1726883015.42407: exiting _queue_task() for managed_node2/debug 28983 1726883015.42419: done queuing things up, now waiting for results queue to drain 28983 1726883015.42421: waiting for pending results... 28983 1726883015.42606: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883015.42722: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b46 28983 1726883015.42738: variable 'ansible_search_path' from source: unknown 28983 1726883015.42742: variable 'ansible_search_path' from source: unknown 28983 1726883015.42774: calling self._execute() 28983 1726883015.42849: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.42855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.42867: variable 'omit' from source: magic vars 28983 1726883015.43182: variable 'ansible_distribution_major_version' from source: facts 28983 1726883015.43193: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883015.43201: variable 'omit' from source: magic vars 28983 1726883015.43252: variable 'omit' from source: magic vars 28983 1726883015.43282: variable 'omit' from source: magic vars 28983 1726883015.43320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883015.43349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883015.43368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883015.43386: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883015.43395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883015.43425: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883015.43429: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.43432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.43511: Set connection var ansible_connection to ssh 28983 1726883015.43522: Set connection var ansible_shell_executable to /bin/sh 28983 1726883015.43531: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883015.43548: Set connection var ansible_timeout to 10 28983 1726883015.43554: Set connection var ansible_pipelining to False 28983 1726883015.43557: Set connection var ansible_shell_type to sh 28983 1726883015.43578: variable 'ansible_shell_executable' from source: unknown 28983 1726883015.43581: variable 'ansible_connection' from source: unknown 28983 1726883015.43584: variable 'ansible_module_compression' from source: unknown 28983 1726883015.43589: variable 'ansible_shell_type' from source: unknown 28983 1726883015.43591: variable 'ansible_shell_executable' from source: unknown 28983 1726883015.43596: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.43601: variable 'ansible_pipelining' from source: unknown 28983 1726883015.43605: variable 'ansible_timeout' from source: unknown 28983 1726883015.43610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.43729: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883015.43743: variable 'omit' from source: magic vars 28983 1726883015.43749: starting attempt loop 28983 1726883015.43752: running the handler 28983 1726883015.43796: variable '__network_connections_result' from source: set_fact 28983 1726883015.43858: variable '__network_connections_result' from source: set_fact 28983 1726883015.43957: handler run complete 28983 1726883015.43985: attempt loop complete, returning result 28983 1726883015.43989: _execute() done 28983 1726883015.43992: dumping result to json 28983 1726883015.43995: done dumping result, returning 28983 1726883015.44003: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000000b46] 28983 1726883015.44013: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b46 28983 1726883015.44117: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b46 28983 1726883015.44120: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a" ] } } 28983 1726883015.44221: no more pending results, returning what we have 28983 1726883015.44224: results queue 
empty 28983 1726883015.44225: checking for any_errors_fatal 28983 1726883015.44231: done checking for any_errors_fatal 28983 1726883015.44231: checking for max_fail_percentage 28983 1726883015.44233: done checking for max_fail_percentage 28983 1726883015.44242: checking to see if all hosts have failed and the running result is not ok 28983 1726883015.44243: done checking to see if all hosts have failed 28983 1726883015.44244: getting the remaining hosts for this loop 28983 1726883015.44245: done getting the remaining hosts for this loop 28983 1726883015.44249: getting the next task for host managed_node2 28983 1726883015.44257: done getting next task for host managed_node2 28983 1726883015.44261: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883015.44265: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883015.44279: getting variables 28983 1726883015.44280: in VariableManager get_vars() 28983 1726883015.44308: Calling all_inventory to load vars for managed_node2 28983 1726883015.44310: Calling groups_inventory to load vars for managed_node2 28983 1726883015.44311: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.44317: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.44320: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.44322: Calling groups_plugins_play to load vars for managed_node2 28983 1726883015.45655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883015.47845: done with get_vars() 28983 1726883015.47866: done getting variables 28983 1726883015.47911: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:43:35 -0400 (0:00:00.057) 0:00:45.477 ****** 28983 1726883015.47946: entering _queue_task() for managed_node2/debug 28983 1726883015.48150: worker is 1 (out of 1 available) 28983 1726883015.48163: exiting _queue_task() for managed_node2/debug 28983 1726883015.48177: done queuing things up, now waiting for results queue to drain 28983 1726883015.48179: waiting for pending results... 
28983 1726883015.48364: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883015.48487: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b47 28983 1726883015.48500: variable 'ansible_search_path' from source: unknown 28983 1726883015.48503: variable 'ansible_search_path' from source: unknown 28983 1726883015.48538: calling self._execute() 28983 1726883015.48610: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.48618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.48631: variable 'omit' from source: magic vars 28983 1726883015.48933: variable 'ansible_distribution_major_version' from source: facts 28983 1726883015.48947: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883015.49046: variable 'network_state' from source: role '' defaults 28983 1726883015.49057: Evaluated conditional (network_state != {}): False 28983 1726883015.49061: when evaluation is False, skipping this task 28983 1726883015.49064: _execute() done 28983 1726883015.49068: dumping result to json 28983 1726883015.49073: done dumping result, returning 28983 1726883015.49085: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000000b47] 28983 1726883015.49088: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b47 28983 1726883015.49187: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b47 28983 1726883015.49190: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883015.49240: no more pending results, returning what we have 28983 1726883015.49244: results queue empty 28983 1726883015.49245: checking for any_errors_fatal 28983 1726883015.49253: done checking for any_errors_fatal 28983 1726883015.49254: checking for 
max_fail_percentage 28983 1726883015.49256: done checking for max_fail_percentage 28983 1726883015.49257: checking to see if all hosts have failed and the running result is not ok 28983 1726883015.49258: done checking to see if all hosts have failed 28983 1726883015.49259: getting the remaining hosts for this loop 28983 1726883015.49260: done getting the remaining hosts for this loop 28983 1726883015.49264: getting the next task for host managed_node2 28983 1726883015.49273: done getting next task for host managed_node2 28983 1726883015.49277: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883015.49283: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883015.49303: getting variables 28983 1726883015.49304: in VariableManager get_vars() 28983 1726883015.49337: Calling all_inventory to load vars for managed_node2 28983 1726883015.49339: Calling groups_inventory to load vars for managed_node2 28983 1726883015.49341: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.49347: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.49349: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.49352: Calling groups_plugins_play to load vars for managed_node2 28983 1726883015.51115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883015.54286: done with get_vars() 28983 1726883015.54319: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:43:35 -0400 (0:00:00.064) 0:00:45.542 ****** 28983 1726883015.54429: entering _queue_task() for managed_node2/ping 28983 1726883015.54720: worker is 1 (out of 1 available) 28983 1726883015.54737: exiting _queue_task() for managed_node2/ping 28983 1726883015.54750: done queuing things up, now waiting for results queue to drain 28983 1726883015.54752: waiting for pending results... 
28983 1726883015.55018: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883015.55223: in run() - task 0affe814-3a2d-b16d-c0a7-000000000b48 28983 1726883015.55228: variable 'ansible_search_path' from source: unknown 28983 1726883015.55230: variable 'ansible_search_path' from source: unknown 28983 1726883015.55261: calling self._execute() 28983 1726883015.55376: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.55392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.55441: variable 'omit' from source: magic vars 28983 1726883015.55857: variable 'ansible_distribution_major_version' from source: facts 28983 1726883015.55882: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883015.55894: variable 'omit' from source: magic vars 28983 1726883015.55978: variable 'omit' from source: magic vars 28983 1726883015.56098: variable 'omit' from source: magic vars 28983 1726883015.56102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883015.56129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883015.56162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883015.56187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883015.56209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883015.56251: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883015.56261: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.56269: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883015.56393: Set connection var ansible_connection to ssh 28983 1726883015.56411: Set connection var ansible_shell_executable to /bin/sh 28983 1726883015.56433: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883015.56451: Set connection var ansible_timeout to 10 28983 1726883015.56463: Set connection var ansible_pipelining to False 28983 1726883015.56470: Set connection var ansible_shell_type to sh 28983 1726883015.56500: variable 'ansible_shell_executable' from source: unknown 28983 1726883015.56509: variable 'ansible_connection' from source: unknown 28983 1726883015.56516: variable 'ansible_module_compression' from source: unknown 28983 1726883015.56528: variable 'ansible_shell_type' from source: unknown 28983 1726883015.56537: variable 'ansible_shell_executable' from source: unknown 28983 1726883015.56546: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883015.56558: variable 'ansible_pipelining' from source: unknown 28983 1726883015.56566: variable 'ansible_timeout' from source: unknown 28983 1726883015.56640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883015.56815: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883015.56838: variable 'omit' from source: magic vars 28983 1726883015.56850: starting attempt loop 28983 1726883015.56865: running the handler 28983 1726883015.56885: _low_level_execute_command(): starting 28983 1726883015.56898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883015.57806: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.57847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.57863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883015.57959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.59764: stdout chunk (state=3): >>>/root <<< 28983 1726883015.59939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883015.59953: stderr chunk (state=3): >>><<< 28983 1726883015.59965: stdout chunk (state=3): >>><<< 28983 1726883015.60002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883015.60025: _low_level_execute_command(): starting 28983 1726883015.60040: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069 `" && echo ansible-tmp-1726883015.6001034-30619-194744892064069="` echo /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069 `" ) && sleep 0' 28983 1726883015.60740: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883015.60757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883015.60784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883015.60816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883015.60904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.60926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.61004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.61033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883015.61077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883015.61201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.63241: stdout chunk (state=3): >>>ansible-tmp-1726883015.6001034-30619-194744892064069=/root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069 <<< 28983 1726883015.63639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883015.63643: stdout chunk (state=3): >>><<< 28983 1726883015.63645: stderr chunk (state=3): >>><<< 28983 1726883015.63648: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883015.6001034-30619-194744892064069=/root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883015.63650: variable 'ansible_module_compression' from source: unknown 28983 1726883015.63652: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883015.63654: variable 'ansible_facts' from source: unknown 28983 1726883015.63685: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py 28983 1726883015.63907: Sending initial data 28983 1726883015.63910: Sent initial data (153 bytes) 28983 1726883015.64486: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883015.64550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.64623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.64647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883015.64668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883015.64775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.66427: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883015.66521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883015.66599: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp5_9wt99r /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py <<< 28983 1726883015.66611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py" <<< 28983 1726883015.66673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp5_9wt99r" to remote "/root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py" <<< 28983 1726883015.68025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883015.68028: stdout chunk (state=3): >>><<< 28983 1726883015.68030: stderr chunk (state=3): >>><<< 28983 1726883015.68032: done transferring module to remote 28983 1726883015.68052: _low_level_execute_command(): starting 28983 1726883015.68055: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/ /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py && sleep 0' 28983 1726883015.68591: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883015.68613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883015.68651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726883015.68665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883015.68727: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.68783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.68808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883015.68832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883015.68933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.71057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883015.71064: stdout chunk (state=3): >>><<< 28983 1726883015.71067: stderr chunk (state=3): >>><<< 28983 1726883015.71089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883015.71120: _low_level_execute_command(): starting 28983 1726883015.71221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/AnsiballZ_ping.py && sleep 0' 28983 1726883015.72240: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883015.72244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.72247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883015.72249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.72275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.72278: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726883015.72392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.89543: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883015.91062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883015.91067: stdout chunk (state=3): >>><<< 28983 1726883015.91069: stderr chunk (state=3): >>><<< 28983 1726883015.91226: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
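The round-trip above ends with the module's stdout chunk `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`. As an annotation, here is a minimal sketch of the contract that output implies; `ping_module` is a hypothetical stand-in, not the real AnsiballZ-wrapped payload that was transferred and executed.

```python
import json

def ping_module(module_args):
    # Hypothetical sketch of the behavior visible in the log:
    # echo the 'data' argument (default "pong") and report the
    # invocation that was received. (The real module also raises
    # when data == "crash", per its documented contract.)
    data = module_args.get("data", "pong")
    if data == "crash":
        raise Exception("boom")
    return {
        "ping": data,
        "invocation": {"module_args": {"data": data}},
    }

if __name__ == "__main__":
    # Matches the stdout chunk captured above:
    # {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
    print(json.dumps(ping_module({"data": "pong"})))
```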
28983 1726883015.91230: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883015.91237: _low_level_execute_command(): starting 28983 1726883015.91240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883015.6001034-30619-194744892064069/ > /dev/null 2>&1 && sleep 0' 28983 1726883015.91886: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883015.91912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883015.91936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883015.92036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883015.92077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883015.92098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883015.92139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883015.92246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883015.94259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883015.94263: stdout chunk (state=3): >>><<< 28983 1726883015.94265: stderr chunk (state=3): >>><<< 28983 1726883015.94495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883015.94502: handler run complete 28983 
1726883015.94505: attempt loop complete, returning result 28983 1726883015.94507: _execute() done 28983 1726883015.94509: dumping result to json 28983 1726883015.94512: done dumping result, returning 28983 1726883015.94514: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000000b48] 28983 1726883015.94516: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b48 28983 1726883015.94599: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000b48 28983 1726883015.94603: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883015.94691: no more pending results, returning what we have 28983 1726883015.94696: results queue empty 28983 1726883015.94697: checking for any_errors_fatal 28983 1726883015.94705: done checking for any_errors_fatal 28983 1726883015.94706: checking for max_fail_percentage 28983 1726883015.94708: done checking for max_fail_percentage 28983 1726883015.94710: checking to see if all hosts have failed and the running result is not ok 28983 1726883015.94711: done checking to see if all hosts have failed 28983 1726883015.94714: getting the remaining hosts for this loop 28983 1726883015.94717: done getting the remaining hosts for this loop 28983 1726883015.94722: getting the next task for host managed_node2 28983 1726883015.94737: done getting next task for host managed_node2 28983 1726883015.94740: ^ task is: TASK: meta (role_complete) 28983 1726883015.94748: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883015.94766: getting variables 28983 1726883015.94769: in VariableManager get_vars() 28983 1726883015.94820: Calling all_inventory to load vars for managed_node2 28983 1726883015.94824: Calling groups_inventory to load vars for managed_node2 28983 1726883015.94826: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.94943: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.95051: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.95064: Calling groups_plugins_play to load vars for managed_node2 28983 1726883015.96678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883015.98836: done with get_vars() 28983 1726883015.98862: done getting variables 28983 1726883015.98937: done queuing things up, now waiting for results queue to drain 28983 1726883015.98938: results queue empty 28983 1726883015.98939: checking for any_errors_fatal 28983 1726883015.98941: done checking for any_errors_fatal 28983 1726883015.98942: checking for max_fail_percentage 28983 1726883015.98943: done checking for max_fail_percentage 28983 1726883015.98943: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883015.98944: done checking to see if all hosts have failed 28983 1726883015.98944: getting the remaining hosts for this loop 28983 1726883015.98945: done getting the remaining hosts for this loop 28983 1726883015.98948: getting the next task for host managed_node2 28983 1726883015.98953: done getting next task for host managed_node2 28983 1726883015.98955: ^ task is: TASK: Show result 28983 1726883015.98957: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883015.98959: getting variables 28983 1726883015.98960: in VariableManager get_vars() 28983 1726883015.98968: Calling all_inventory to load vars for managed_node2 28983 1726883015.98970: Calling groups_inventory to load vars for managed_node2 28983 1726883015.98972: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883015.98976: Calling all_plugins_play to load vars for managed_node2 28983 1726883015.98978: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883015.98980: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.00155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.02770: done with get_vars() 28983 1726883016.02807: done getting variables 28983 1726883016.02867: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:43:36 -0400 (0:00:00.484) 0:00:46.026 ****** 28983 1726883016.02906: entering _queue_task() for managed_node2/debug 28983 1726883016.03409: worker is 1 (out of 1 available) 28983 1726883016.03423: exiting _queue_task() for managed_node2/debug 28983 1726883016.03642: done queuing things up, now waiting for results queue to drain 28983 1726883016.03644: waiting for pending results... 
28983 1726883016.03966: running TaskExecutor() for managed_node2/TASK: Show result 28983 1726883016.03970: in run() - task 0affe814-3a2d-b16d-c0a7-000000000ad2 28983 1726883016.03973: variable 'ansible_search_path' from source: unknown 28983 1726883016.03976: variable 'ansible_search_path' from source: unknown 28983 1726883016.03993: calling self._execute() 28983 1726883016.04113: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.04128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.04151: variable 'omit' from source: magic vars 28983 1726883016.04608: variable 'ansible_distribution_major_version' from source: facts 28983 1726883016.04627: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883016.04640: variable 'omit' from source: magic vars 28983 1726883016.04704: variable 'omit' from source: magic vars 28983 1726883016.04757: variable 'omit' from source: magic vars 28983 1726883016.04810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883016.04862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883016.04890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883016.04931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883016.04939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883016.05040: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883016.05045: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.05048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.05123: Set 
connection var ansible_connection to ssh 28983 1726883016.05144: Set connection var ansible_shell_executable to /bin/sh 28983 1726883016.05164: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883016.05179: Set connection var ansible_timeout to 10 28983 1726883016.05190: Set connection var ansible_pipelining to False 28983 1726883016.05197: Set connection var ansible_shell_type to sh 28983 1726883016.05227: variable 'ansible_shell_executable' from source: unknown 28983 1726883016.05237: variable 'ansible_connection' from source: unknown 28983 1726883016.05246: variable 'ansible_module_compression' from source: unknown 28983 1726883016.05254: variable 'ansible_shell_type' from source: unknown 28983 1726883016.05439: variable 'ansible_shell_executable' from source: unknown 28983 1726883016.05443: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.05445: variable 'ansible_pipelining' from source: unknown 28983 1726883016.05447: variable 'ansible_timeout' from source: unknown 28983 1726883016.05450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.05453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883016.05474: variable 'omit' from source: magic vars 28983 1726883016.05485: starting attempt loop 28983 1726883016.05493: running the handler 28983 1726883016.05551: variable '__network_connections_result' from source: set_fact 28983 1726883016.05651: variable '__network_connections_result' from source: set_fact 28983 1726883016.05812: handler run complete 28983 1726883016.05857: attempt loop complete, returning result 28983 1726883016.05865: _execute() done 28983 1726883016.05873: dumping result to json 28983 
1726883016.05884: done dumping result, returning 28983 1726883016.05900: done running TaskExecutor() for managed_node2/TASK: Show result [0affe814-3a2d-b16d-c0a7-000000000ad2] 28983 1726883016.05910: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000ad2 28983 1726883016.06178: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000ad2 28983 1726883016.06182: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, f251b268-4387-4b61-a766-95deb90f678a" ] } } 28983 1726883016.06278: no more pending results, returning what we have 28983 1726883016.06282: results queue empty 28983 1726883016.06284: checking for any_errors_fatal 28983 1726883016.06286: done checking for any_errors_fatal 28983 1726883016.06287: checking for max_fail_percentage 28983 1726883016.06289: done checking for max_fail_percentage 28983 1726883016.06291: checking to see if all hosts have failed and the running result is not ok 28983 1726883016.06292: done checking to see if all hosts have failed 28983 1726883016.06293: getting the remaining hosts for this loop 28983 1726883016.06296: done getting the remaining hosts for this loop 28983 1726883016.06301: getting the next task for host managed_node2 28983 1726883016.06312: done getting next task for host managed_node2 28983 1726883016.06316: ^ task is: TASK: Test 28983 1726883016.06320: ^ state is: HOST STATE: 
block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883016.06327: getting variables 28983 1726883016.06328: in VariableManager get_vars() 28983 1726883016.06370: Calling all_inventory to load vars for managed_node2 28983 1726883016.06374: Calling groups_inventory to load vars for managed_node2 28983 1726883016.06379: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.06390: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.06395: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.06399: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.08777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.11850: done with get_vars() 28983 1726883016.11884: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:43:36 -0400 (0:00:00.090) 0:00:46.117 ****** 28983 1726883016.12002: entering _queue_task() for managed_node2/include_tasks 28983 1726883016.12332: worker is 1 (out of 1 available) 28983 1726883016.12546: exiting _queue_task() for managed_node2/include_tasks 28983 1726883016.12557: done queuing things up, now waiting for results queue to drain 28983 1726883016.12558: waiting for pending results... 
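Every record in this trace starts with the worker PID and a high-resolution epoch timestamp (e.g. `28983 1726883016.12002: entering _queue_task() ...`), followed by the message; lines without that prefix are continuations of a wrapped record. A small hypothetical helper for slicing such a trace, assuming only this observed `pid ts: msg` shape:

```python
import re

# Assumed record shape, taken from the trace itself:
# "<pid> <epoch.micro>: <message>"
LOG_RE = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_record(line):
    # Return (pid, timestamp, message) for a record line,
    # or None for a continuation of a wrapped record.
    m = LOG_RE.match(line)
    if not m:
        return None
    return m.group("pid"), float(m.group("ts")), m.group("msg")

if __name__ == "__main__":
    rec = parse_record(
        "28983 1726883016.12002: entering _queue_task() "
        "for managed_node2/include_tasks"
    )
    print(rec)
```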
28983 1726883016.12702: running TaskExecutor() for managed_node2/TASK: Test 28983 1726883016.12822: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a4d 28983 1726883016.12893: variable 'ansible_search_path' from source: unknown 28983 1726883016.12897: variable 'ansible_search_path' from source: unknown 28983 1726883016.12914: variable 'lsr_test' from source: include params 28983 1726883016.13162: variable 'lsr_test' from source: include params 28983 1726883016.13248: variable 'omit' from source: magic vars 28983 1726883016.13406: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.13425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.13541: variable 'omit' from source: magic vars 28983 1726883016.13749: variable 'ansible_distribution_major_version' from source: facts 28983 1726883016.13771: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883016.13784: variable 'item' from source: unknown 28983 1726883016.13875: variable 'item' from source: unknown 28983 1726883016.13919: variable 'item' from source: unknown 28983 1726883016.14006: variable 'item' from source: unknown 28983 1726883016.14268: dumping result to json 28983 1726883016.14272: done dumping result, returning 28983 1726883016.14275: done running TaskExecutor() for managed_node2/TASK: Test [0affe814-3a2d-b16d-c0a7-000000000a4d] 28983 1726883016.14278: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4d 28983 1726883016.14574: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4d 28983 1726883016.14578: WORKER PROCESS EXITING 28983 1726883016.14605: no more pending results, returning what we have 28983 1726883016.14609: in VariableManager get_vars() 28983 1726883016.14646: Calling all_inventory to load vars for managed_node2 28983 1726883016.14649: Calling groups_inventory to load vars for managed_node2 28983 1726883016.14653: Calling all_plugins_inventory to load 
vars for managed_node2 28983 1726883016.14663: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.14667: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.14671: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.16848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.19794: done with get_vars() 28983 1726883016.19826: variable 'ansible_search_path' from source: unknown 28983 1726883016.19827: variable 'ansible_search_path' from source: unknown 28983 1726883016.19875: we have included files to process 28983 1726883016.19876: generating all_blocks data 28983 1726883016.19879: done generating all_blocks data 28983 1726883016.19885: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883016.19887: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883016.19890: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883016.20101: done processing included file 28983 1726883016.20104: iterating over new_blocks loaded from include file 28983 1726883016.20106: in VariableManager get_vars() 28983 1726883016.20125: done with get_vars() 28983 1726883016.20126: filtering new block on tags 28983 1726883016.20160: done filtering new block on tags 28983 1726883016.20163: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 28983 1726883016.20169: extending task lists for all hosts with included blocks 28983 1726883016.21225: done extending task lists 28983 
1726883016.21227: done processing included files 28983 1726883016.21228: results queue empty 28983 1726883016.21229: checking for any_errors_fatal 28983 1726883016.21236: done checking for any_errors_fatal 28983 1726883016.21237: checking for max_fail_percentage 28983 1726883016.21238: done checking for max_fail_percentage 28983 1726883016.21239: checking to see if all hosts have failed and the running result is not ok 28983 1726883016.21240: done checking to see if all hosts have failed 28983 1726883016.21241: getting the remaining hosts for this loop 28983 1726883016.21243: done getting the remaining hosts for this loop 28983 1726883016.21246: getting the next task for host managed_node2 28983 1726883016.21252: done getting next task for host managed_node2 28983 1726883016.21254: ^ task is: TASK: Include network role 28983 1726883016.21257: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883016.21260: getting variables 28983 1726883016.21261: in VariableManager get_vars() 28983 1726883016.21272: Calling all_inventory to load vars for managed_node2 28983 1726883016.21275: Calling groups_inventory to load vars for managed_node2 28983 1726883016.21278: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.21286: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.21289: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.21293: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.23359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.27502: done with get_vars() 28983 1726883016.27738: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:43:36 -0400 (0:00:00.158) 0:00:46.276 ****** 28983 1726883016.27846: entering _queue_task() for managed_node2/include_role 28983 1726883016.28598: worker is 1 (out of 1 available) 28983 1726883016.28610: exiting _queue_task() for managed_node2/include_role 28983 1726883016.28621: done queuing things up, now waiting for results queue to drain 28983 1726883016.28623: waiting for pending results... 
28983 1726883016.29357: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883016.29516: in run() - task 0affe814-3a2d-b16d-c0a7-000000000caa 28983 1726883016.29583: variable 'ansible_search_path' from source: unknown 28983 1726883016.29646: variable 'ansible_search_path' from source: unknown 28983 1726883016.29701: calling self._execute() 28983 1726883016.30106: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.30110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.30113: variable 'omit' from source: magic vars 28983 1726883016.31074: variable 'ansible_distribution_major_version' from source: facts 28983 1726883016.31100: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883016.31112: _execute() done 28983 1726883016.31148: dumping result to json 28983 1726883016.31165: done dumping result, returning 28983 1726883016.31181: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000000caa] 28983 1726883016.31216: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000caa 28983 1726883016.31500: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000caa 28983 1726883016.31504: WORKER PROCESS EXITING 28983 1726883016.31538: no more pending results, returning what we have 28983 1726883016.31543: in VariableManager get_vars() 28983 1726883016.31579: Calling all_inventory to load vars for managed_node2 28983 1726883016.31582: Calling groups_inventory to load vars for managed_node2 28983 1726883016.31586: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.31655: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.31660: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.31669: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.34521: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.38740: done with get_vars() 28983 1726883016.38773: variable 'ansible_search_path' from source: unknown 28983 1726883016.38775: variable 'ansible_search_path' from source: unknown 28983 1726883016.38967: variable 'omit' from source: magic vars 28983 1726883016.39100: variable 'omit' from source: magic vars 28983 1726883016.39120: variable 'omit' from source: magic vars 28983 1726883016.39124: we have included files to process 28983 1726883016.39125: generating all_blocks data 28983 1726883016.39127: done generating all_blocks data 28983 1726883016.39129: processing included file: fedora.linux_system_roles.network 28983 1726883016.39157: in VariableManager get_vars() 28983 1726883016.39173: done with get_vars() 28983 1726883016.39206: in VariableManager get_vars() 28983 1726883016.39226: done with get_vars() 28983 1726883016.39406: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883016.39594: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883016.39708: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883016.40332: in VariableManager get_vars() 28983 1726883016.40360: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883016.42993: iterating over new_blocks loaded from include file 28983 1726883016.42995: in VariableManager get_vars() 28983 1726883016.43015: done with get_vars() 28983 1726883016.43017: filtering new block on tags 28983 1726883016.43388: done filtering new block on tags 28983 1726883016.43392: in VariableManager get_vars() 28983 1726883016.43408: done with get_vars() 28983 1726883016.43409: filtering new block on tags 28983 1726883016.43429: done 
filtering new block on tags 28983 1726883016.43431: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883016.43440: extending task lists for all hosts with included blocks 28983 1726883016.43595: done extending task lists 28983 1726883016.43597: done processing included files 28983 1726883016.43598: results queue empty 28983 1726883016.43599: checking for any_errors_fatal 28983 1726883016.43604: done checking for any_errors_fatal 28983 1726883016.43605: checking for max_fail_percentage 28983 1726883016.43606: done checking for max_fail_percentage 28983 1726883016.43607: checking to see if all hosts have failed and the running result is not ok 28983 1726883016.43608: done checking to see if all hosts have failed 28983 1726883016.43609: getting the remaining hosts for this loop 28983 1726883016.43611: done getting the remaining hosts for this loop 28983 1726883016.43614: getting the next task for host managed_node2 28983 1726883016.43620: done getting next task for host managed_node2 28983 1726883016.43623: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883016.43627: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883016.43640: getting variables 28983 1726883016.43642: in VariableManager get_vars() 28983 1726883016.43657: Calling all_inventory to load vars for managed_node2 28983 1726883016.43660: Calling groups_inventory to load vars for managed_node2 28983 1726883016.43663: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.43669: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.43673: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.43677: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.45655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.48635: done with get_vars() 28983 1726883016.48668: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:43:36 -0400 (0:00:00.209) 0:00:46.485 ****** 28983 1726883016.48757: entering _queue_task() for managed_node2/include_tasks 28983 1726883016.49127: worker is 1 (out of 1 available) 28983 1726883016.49144: exiting _queue_task() for managed_node2/include_tasks 28983 1726883016.49156: done queuing things up, now waiting for results queue to drain 28983 1726883016.49158: waiting for pending results... 
28983 1726883016.49419: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883016.49626: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d16 28983 1726883016.49630: variable 'ansible_search_path' from source: unknown 28983 1726883016.49633: variable 'ansible_search_path' from source: unknown 28983 1726883016.49661: calling self._execute() 28983 1726883016.49777: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.49794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.49841: variable 'omit' from source: magic vars 28983 1726883016.50254: variable 'ansible_distribution_major_version' from source: facts 28983 1726883016.50281: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883016.50293: _execute() done 28983 1726883016.50302: dumping result to json 28983 1726883016.50310: done dumping result, returning 28983 1726883016.50389: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-000000000d16] 28983 1726883016.50393: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d16 28983 1726883016.50521: no more pending results, returning what we have 28983 1726883016.50526: in VariableManager get_vars() 28983 1726883016.50573: Calling all_inventory to load vars for managed_node2 28983 1726883016.50576: Calling groups_inventory to load vars for managed_node2 28983 1726883016.50579: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.50591: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.50595: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.50598: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.51117: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d16 28983 
1726883016.51121: WORKER PROCESS EXITING 28983 1726883016.52986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.56079: done with get_vars() 28983 1726883016.56113: variable 'ansible_search_path' from source: unknown 28983 1726883016.56114: variable 'ansible_search_path' from source: unknown 28983 1726883016.56169: we have included files to process 28983 1726883016.56173: generating all_blocks data 28983 1726883016.56176: done generating all_blocks data 28983 1726883016.56179: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883016.56180: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883016.56183: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883016.56973: done processing included file 28983 1726883016.56975: iterating over new_blocks loaded from include file 28983 1726883016.56977: in VariableManager get_vars() 28983 1726883016.57008: done with get_vars() 28983 1726883016.57010: filtering new block on tags 28983 1726883016.57056: done filtering new block on tags 28983 1726883016.57059: in VariableManager get_vars() 28983 1726883016.57090: done with get_vars() 28983 1726883016.57092: filtering new block on tags 28983 1726883016.57159: done filtering new block on tags 28983 1726883016.57162: in VariableManager get_vars() 28983 1726883016.57193: done with get_vars() 28983 1726883016.57195: filtering new block on tags 28983 1726883016.57258: done filtering new block on tags 28983 1726883016.57261: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883016.57267: extending task lists for all hosts 
with included blocks 28983 1726883016.60127: done extending task lists 28983 1726883016.60129: done processing included files 28983 1726883016.60130: results queue empty 28983 1726883016.60131: checking for any_errors_fatal 28983 1726883016.60137: done checking for any_errors_fatal 28983 1726883016.60138: checking for max_fail_percentage 28983 1726883016.60139: done checking for max_fail_percentage 28983 1726883016.60140: checking to see if all hosts have failed and the running result is not ok 28983 1726883016.60141: done checking to see if all hosts have failed 28983 1726883016.60142: getting the remaining hosts for this loop 28983 1726883016.60144: done getting the remaining hosts for this loop 28983 1726883016.60148: getting the next task for host managed_node2 28983 1726883016.60154: done getting next task for host managed_node2 28983 1726883016.60157: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883016.60162: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883016.60176: getting variables 28983 1726883016.60178: in VariableManager get_vars() 28983 1726883016.60193: Calling all_inventory to load vars for managed_node2 28983 1726883016.60195: Calling groups_inventory to load vars for managed_node2 28983 1726883016.60198: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.60204: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.60323: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.60329: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.63023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.68565: done with get_vars() 28983 1726883016.68603: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:43:36 -0400 (0:00:00.201) 0:00:46.686 ****** 28983 1726883016.68905: entering _queue_task() for managed_node2/setup 28983 1726883016.69673: worker is 1 (out of 1 available) 28983 1726883016.69685: exiting _queue_task() for managed_node2/setup 28983 1726883016.69697: done queuing things up, now waiting for results queue to drain 28983 1726883016.69699: waiting for pending results... 
28983 1726883016.70129: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883016.70503: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d6d 28983 1726883016.70519: variable 'ansible_search_path' from source: unknown 28983 1726883016.70523: variable 'ansible_search_path' from source: unknown 28983 1726883016.70563: calling self._execute() 28983 1726883016.70876: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.70882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.70894: variable 'omit' from source: magic vars 28983 1726883016.71493: variable 'ansible_distribution_major_version' from source: facts 28983 1726883016.71505: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883016.71821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883016.74448: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883016.74486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883016.74536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883016.74585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883016.74625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883016.74725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883016.74768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883016.74805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883016.74867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883016.74891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883016.74967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883016.75032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883016.75044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883016.75099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883016.75123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883016.75360: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883016.75363: variable 'ansible_facts' from source: unknown 28983 1726883016.77841: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883016.77845: when evaluation is False, skipping this task 28983 1726883016.77849: _execute() done 28983 1726883016.77852: dumping result to json 28983 1726883016.77856: done dumping result, returning 28983 1726883016.77864: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000000d6d] 28983 1726883016.77866: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d6d 28983 1726883016.77940: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d6d 28983 1726883016.77944: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883016.78024: no more pending results, returning what we have 28983 1726883016.78029: results queue empty 28983 1726883016.78030: checking for any_errors_fatal 28983 1726883016.78032: done checking for any_errors_fatal 28983 1726883016.78035: checking for max_fail_percentage 28983 1726883016.78040: done checking for max_fail_percentage 28983 1726883016.78041: checking to see if all hosts have failed and the running result is not ok 28983 1726883016.78042: done checking to see if all hosts have failed 28983 1726883016.78043: getting the remaining hosts for this loop 28983 1726883016.78045: done getting the remaining hosts for this loop 28983 1726883016.78052: getting the next task for host managed_node2 28983 1726883016.78065: done getting next task for host managed_node2 28983 1726883016.78070: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883016.78082: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883016.78109: getting variables 28983 1726883016.78111: in VariableManager get_vars() 28983 1726883016.78355: Calling all_inventory to load vars for managed_node2 28983 1726883016.78358: Calling groups_inventory to load vars for managed_node2 28983 1726883016.78361: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883016.78374: Calling all_plugins_play to load vars for managed_node2 28983 1726883016.78378: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883016.78388: Calling groups_plugins_play to load vars for managed_node2 28983 1726883016.83552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883016.95890: done with get_vars() 28983 1726883016.95935: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:43:36 -0400 (0:00:00.271) 0:00:46.958 ****** 28983 1726883016.96053: entering _queue_task() for managed_node2/stat 28983 1726883016.96489: worker is 1 (out of 1 available) 28983 1726883016.96503: exiting _queue_task() for managed_node2/stat 28983 1726883016.96518: done queuing things up, now waiting for results queue to drain 28983 1726883016.96520: waiting for pending results... 
28983 1726883016.97253: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883016.97663: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d6f 28983 1726883016.97668: variable 'ansible_search_path' from source: unknown 28983 1726883016.97671: variable 'ansible_search_path' from source: unknown 28983 1726883016.97674: calling self._execute() 28983 1726883016.97908: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883016.97923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883016.97944: variable 'omit' from source: magic vars 28983 1726883016.98802: variable 'ansible_distribution_major_version' from source: facts 28983 1726883016.98887: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883016.99216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883017.00033: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883017.00440: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883017.00445: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883017.00448: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883017.00611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883017.00760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883017.00832: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883017.01079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883017.01208: variable '__network_is_ostree' from source: set_fact 28983 1726883017.01224: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883017.01359: when evaluation is False, skipping this task 28983 1726883017.01375: _execute() done 28983 1726883017.01385: dumping result to json 28983 1726883017.01394: done dumping result, returning 28983 1726883017.01407: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000000d6f] 28983 1726883017.01418: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d6f skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883017.01776: no more pending results, returning what we have 28983 1726883017.01781: results queue empty 28983 1726883017.01782: checking for any_errors_fatal 28983 1726883017.01791: done checking for any_errors_fatal 28983 1726883017.01792: checking for max_fail_percentage 28983 1726883017.01794: done checking for max_fail_percentage 28983 1726883017.01795: checking to see if all hosts have failed and the running result is not ok 28983 1726883017.01796: done checking to see if all hosts have failed 28983 1726883017.01797: getting the remaining hosts for this loop 28983 1726883017.01799: done getting the remaining hosts for this loop 28983 1726883017.01804: getting the next task for host managed_node2 28983 1726883017.01814: done getting next task for host managed_node2 28983 
1726883017.01818: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883017.01825: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883017.01851: getting variables 28983 1726883017.01853: in VariableManager get_vars() 28983 1726883017.01902: Calling all_inventory to load vars for managed_node2 28983 1726883017.01906: Calling groups_inventory to load vars for managed_node2 28983 1726883017.01909: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883017.01922: Calling all_plugins_play to load vars for managed_node2 28983 1726883017.01926: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883017.01930: Calling groups_plugins_play to load vars for managed_node2 28983 1726883017.02641: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d6f 28983 1726883017.02645: WORKER PROCESS EXITING 28983 1726883017.06765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883017.11689: done with get_vars() 28983 1726883017.11755: done getting variables 28983 1726883017.11858: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:43:37 -0400 (0:00:00.158) 0:00:47.116 ****** 28983 1726883017.11919: entering _queue_task() for managed_node2/set_fact 28983 1726883017.12701: worker is 1 (out of 1 available) 28983 1726883017.12714: exiting _queue_task() for managed_node2/set_fact 28983 1726883017.12729: done queuing things up, now waiting for results queue to drain 28983 1726883017.12731: waiting for pending results... 
28983 1726883017.13295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883017.13573: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d70 28983 1726883017.13593: variable 'ansible_search_path' from source: unknown 28983 1726883017.13597: variable 'ansible_search_path' from source: unknown 28983 1726883017.13639: calling self._execute() 28983 1726883017.13769: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883017.13785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883017.13801: variable 'omit' from source: magic vars 28983 1726883017.14295: variable 'ansible_distribution_major_version' from source: facts 28983 1726883017.14314: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883017.14546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883017.14901: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883017.14953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883017.15040: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883017.15091: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883017.15241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883017.15246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883017.15249: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883017.15288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883017.15437: variable '__network_is_ostree' from source: set_fact 28983 1726883017.15467: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883017.15471: when evaluation is False, skipping this task 28983 1726883017.15474: _execute() done 28983 1726883017.15481: dumping result to json 28983 1726883017.15488: done dumping result, returning 28983 1726883017.15499: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000000d70] 28983 1726883017.15615: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d70 28983 1726883017.15973: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d70 28983 1726883017.15977: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883017.16098: no more pending results, returning what we have 28983 1726883017.16102: results queue empty 28983 1726883017.16103: checking for any_errors_fatal 28983 1726883017.16108: done checking for any_errors_fatal 28983 1726883017.16109: checking for max_fail_percentage 28983 1726883017.16111: done checking for max_fail_percentage 28983 1726883017.16112: checking to see if all hosts have failed and the running result is not ok 28983 1726883017.16113: done checking to see if all hosts have failed 28983 1726883017.16114: getting the remaining hosts for this loop 28983 1726883017.16116: done getting the remaining hosts for this loop 
28983 1726883017.16120: getting the next task for host managed_node2 28983 1726883017.16132: done getting next task for host managed_node2 28983 1726883017.16139: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883017.16151: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883017.16174: getting variables 28983 1726883017.16175: in VariableManager get_vars() 28983 1726883017.16215: Calling all_inventory to load vars for managed_node2 28983 1726883017.16218: Calling groups_inventory to load vars for managed_node2 28983 1726883017.16221: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883017.16231: Calling all_plugins_play to load vars for managed_node2 28983 1726883017.16565: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883017.16573: Calling groups_plugins_play to load vars for managed_node2 28983 1726883017.19766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883017.24502: done with get_vars() 28983 1726883017.24544: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:43:37 -0400 (0:00:00.127) 0:00:47.244 ****** 28983 1726883017.24676: entering _queue_task() for managed_node2/service_facts 28983 1726883017.25208: worker is 1 (out of 1 available) 28983 1726883017.25226: exiting _queue_task() for managed_node2/service_facts 28983 1726883017.25241: done queuing things up, now waiting for results queue to drain 28983 1726883017.25243: waiting for pending results... 
28983 1726883017.25514: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883017.25722: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d72 28983 1726883017.25738: variable 'ansible_search_path' from source: unknown 28983 1726883017.25743: variable 'ansible_search_path' from source: unknown 28983 1726883017.25794: calling self._execute() 28983 1726883017.25910: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883017.25918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883017.25931: variable 'omit' from source: magic vars 28983 1726883017.26417: variable 'ansible_distribution_major_version' from source: facts 28983 1726883017.26440: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883017.26447: variable 'omit' from source: magic vars 28983 1726883017.26640: variable 'omit' from source: magic vars 28983 1726883017.26645: variable 'omit' from source: magic vars 28983 1726883017.26661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883017.26710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883017.26736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883017.26767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883017.26783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883017.26820: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883017.26823: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883017.26828: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883017.27140: Set connection var ansible_connection to ssh 28983 1726883017.27143: Set connection var ansible_shell_executable to /bin/sh 28983 1726883017.27146: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883017.27148: Set connection var ansible_timeout to 10 28983 1726883017.27151: Set connection var ansible_pipelining to False 28983 1726883017.27153: Set connection var ansible_shell_type to sh 28983 1726883017.27155: variable 'ansible_shell_executable' from source: unknown 28983 1726883017.27158: variable 'ansible_connection' from source: unknown 28983 1726883017.27161: variable 'ansible_module_compression' from source: unknown 28983 1726883017.27163: variable 'ansible_shell_type' from source: unknown 28983 1726883017.27166: variable 'ansible_shell_executable' from source: unknown 28983 1726883017.27168: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883017.27170: variable 'ansible_pipelining' from source: unknown 28983 1726883017.27173: variable 'ansible_timeout' from source: unknown 28983 1726883017.27175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883017.27423: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883017.27639: variable 'omit' from source: magic vars 28983 1726883017.27642: starting attempt loop 28983 1726883017.27645: running the handler 28983 1726883017.27647: _low_level_execute_command(): starting 28983 1726883017.27649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883017.28282: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883017.28382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883017.28423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883017.28444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883017.28467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883017.28637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883017.30382: stdout chunk (state=3): >>>/root <<< 28983 1726883017.30639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883017.30643: stdout chunk (state=3): >>><<< 28983 1726883017.30646: stderr chunk (state=3): >>><<< 28983 1726883017.30650: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883017.30653: _low_level_execute_command(): starting 28983 1726883017.30657: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727 `" && echo ansible-tmp-1726883017.3061543-30673-186819003943727="` echo /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727 `" ) && sleep 0' 28983 1726883017.31309: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883017.31318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883017.31330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883017.31348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883017.31362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883017.31370: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883017.31384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883017.31407: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883017.31540: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883017.31544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883017.31547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883017.31549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883017.31552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883017.31554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883017.31575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883017.31674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883017.33756: stdout chunk (state=3): >>>ansible-tmp-1726883017.3061543-30673-186819003943727=/root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727 <<< 28983 1726883017.33954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883017.33973: stdout chunk (state=3): >>><<< 28983 1726883017.33988: stderr chunk (state=3): >>><<< 28983 1726883017.34140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883017.3061543-30673-186819003943727=/root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883017.34144: variable 'ansible_module_compression' from source: unknown 28983 1726883017.34147: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883017.34191: variable 'ansible_facts' from source: unknown 28983 1726883017.34296: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py 28983 1726883017.34515: Sending initial data 28983 1726883017.34519: Sent initial data (162 bytes) 28983 1726883017.35258: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883017.35327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883017.35351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883017.35386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883017.35498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883017.37232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883017.37316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883017.37374: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpgrj8ahtg /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py <<< 28983 1726883017.37410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py" <<< 28983 1726883017.37474: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 28983 1726883017.37504: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpgrj8ahtg" to remote "/root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py" <<< 28983 1726883017.38962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883017.38997: stderr chunk (state=3): >>><<< 28983 1726883017.39007: stdout chunk (state=3): >>><<< 28983 1726883017.39037: done transferring module to remote 28983 1726883017.39079: _low_level_execute_command(): starting 28983 1726883017.39082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/ /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py && sleep 0' 28983 1726883017.39844: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883017.39895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883017.39968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883017.41986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883017.42000: stderr chunk (state=3): >>><<< 28983 1726883017.42009: stdout chunk (state=3): >>><<< 28983 1726883017.42031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883017.42043: _low_level_execute_command(): starting 28983 1726883017.42053: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/AnsiballZ_service_facts.py && sleep 0' 28983 1726883017.42702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883017.42719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883017.42738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883017.42759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883017.42778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883017.42804: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883017.42915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883017.42939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883017.43056: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883020.42716: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "stat<<< 28983 1726883020.42731: stdout chunk (state=3): >>>ic", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883020.44344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883020.44403: stderr chunk (state=3): >>><<< 28983 1726883020.44407: stdout chunk (state=3): >>><<< 28983 1726883020.44436: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883020.45144: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883020.45157: _low_level_execute_command(): starting 28983 1726883020.45169: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883017.3061543-30673-186819003943727/ > /dev/null 2>&1 && sleep 0' 28983 1726883020.45779: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883020.45783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883020.45795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.45810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883020.45840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
<<< 28983 1726883020.45844: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883020.45847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.45859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883020.45882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883020.45896: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883020.45900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883020.45902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.45905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883020.46025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883020.46028: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883020.46031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.46033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883020.46074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883020.46127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883020.46212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883020.48249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883020.48266: stderr chunk (state=3): >>><<< 28983 1726883020.48270: stdout chunk (state=3): >>><<< 28983 1726883020.48305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883020.48309: handler run complete 28983 1726883020.48740: variable 'ansible_facts' from source: unknown 28983 1726883020.48788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883020.49591: variable 'ansible_facts' from source: unknown 28983 1726883020.49821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883020.50316: attempt loop complete, returning result 28983 1726883020.50320: _execute() done 28983 1726883020.50412: dumping result to json 28983 1726883020.50507: done dumping result, returning 28983 1726883020.50511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000000d72] 28983 1726883020.50513: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d72 28983 
1726883020.52425: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d72 28983 1726883020.52428: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883020.52573: no more pending results, returning what we have 28983 1726883020.52577: results queue empty 28983 1726883020.52578: checking for any_errors_fatal 28983 1726883020.52586: done checking for any_errors_fatal 28983 1726883020.52587: checking for max_fail_percentage 28983 1726883020.52589: done checking for max_fail_percentage 28983 1726883020.52590: checking to see if all hosts have failed and the running result is not ok 28983 1726883020.52591: done checking to see if all hosts have failed 28983 1726883020.52592: getting the remaining hosts for this loop 28983 1726883020.52594: done getting the remaining hosts for this loop 28983 1726883020.52598: getting the next task for host managed_node2 28983 1726883020.52605: done getting next task for host managed_node2 28983 1726883020.52609: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883020.52617: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883020.52632: getting variables 28983 1726883020.52636: in VariableManager get_vars() 28983 1726883020.52669: Calling all_inventory to load vars for managed_node2 28983 1726883020.52673: Calling groups_inventory to load vars for managed_node2 28983 1726883020.52675: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883020.52686: Calling all_plugins_play to load vars for managed_node2 28983 1726883020.52690: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883020.52694: Calling groups_plugins_play to load vars for managed_node2 28983 1726883020.55216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883020.58487: done with get_vars() 28983 1726883020.58525: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:43:40 -0400 (0:00:03.339) 0:00:50.584 ****** 28983 1726883020.58655: entering _queue_task() for managed_node2/package_facts 28983 1726883020.59014: worker is 1 (out of 1 available) 28983 1726883020.59029: exiting _queue_task() for managed_node2/package_facts 28983 
1726883020.59047: done queuing things up, now waiting for results queue to drain 28983 1726883020.59049: waiting for pending results... 28983 1726883020.59364: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883020.59676: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d73 28983 1726883020.59681: variable 'ansible_search_path' from source: unknown 28983 1726883020.59685: variable 'ansible_search_path' from source: unknown 28983 1726883020.59689: calling self._execute() 28983 1726883020.59813: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883020.59830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883020.59856: variable 'omit' from source: magic vars 28983 1726883020.60362: variable 'ansible_distribution_major_version' from source: facts 28983 1726883020.60388: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883020.60440: variable 'omit' from source: magic vars 28983 1726883020.60532: variable 'omit' from source: magic vars 28983 1726883020.60585: variable 'omit' from source: magic vars 28983 1726883020.60645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883020.60699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883020.60744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883020.60877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883020.60881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883020.60884: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883020.60887: 
variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883020.60890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883020.60982: Set connection var ansible_connection to ssh 28983 1726883020.61010: Set connection var ansible_shell_executable to /bin/sh 28983 1726883020.61026: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883020.61043: Set connection var ansible_timeout to 10 28983 1726883020.61055: Set connection var ansible_pipelining to False 28983 1726883020.61061: Set connection var ansible_shell_type to sh 28983 1726883020.61088: variable 'ansible_shell_executable' from source: unknown 28983 1726883020.61096: variable 'ansible_connection' from source: unknown 28983 1726883020.61103: variable 'ansible_module_compression' from source: unknown 28983 1726883020.61117: variable 'ansible_shell_type' from source: unknown 28983 1726883020.61125: variable 'ansible_shell_executable' from source: unknown 28983 1726883020.61131: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883020.61224: variable 'ansible_pipelining' from source: unknown 28983 1726883020.61227: variable 'ansible_timeout' from source: unknown 28983 1726883020.61229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883020.61388: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883020.61406: variable 'omit' from source: magic vars 28983 1726883020.61416: starting attempt loop 28983 1726883020.61423: running the handler 28983 1726883020.61450: _low_level_execute_command(): starting 28983 1726883020.61463: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883020.62338: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.62344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883020.62373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883020.62491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883020.64273: stdout chunk (state=3): >>>/root <<< 28983 1726883020.64383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883020.64432: stderr chunk (state=3): >>><<< 28983 1726883020.64439: stdout chunk (state=3): >>><<< 28983 1726883020.64458: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883020.64470: _low_level_execute_command(): starting 28983 1726883020.64477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362 `" && echo ansible-tmp-1726883020.6445837-30762-166002495481362="` echo /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362 `" ) && sleep 0' 28983 1726883020.64912: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.64916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883020.64919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883020.64922: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883020.64933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.64978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883020.64982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883020.65065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883020.67085: stdout chunk (state=3): >>>ansible-tmp-1726883020.6445837-30762-166002495481362=/root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362 <<< 28983 1726883020.67205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883020.67250: stderr chunk (state=3): >>><<< 28983 1726883020.67253: stdout chunk (state=3): >>><<< 28983 1726883020.67269: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883020.6445837-30762-166002495481362=/root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883020.67307: variable 'ansible_module_compression' from source: unknown 28983 1726883020.67347: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883020.67404: variable 'ansible_facts' from source: unknown 28983 1726883020.67543: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py 28983 1726883020.67660: Sending initial data 28983 1726883020.67663: Sent initial data (162 bytes) 28983 1726883020.68081: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.68086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.68100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.68151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883020.68166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883020.68239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883020.69900: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883020.69904: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883020.69967: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883020.70030: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpzm9vfkmw /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py <<< 28983 1726883020.70037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py" <<< 28983 1726883020.70099: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpzm9vfkmw" to remote "/root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py" <<< 28983 1726883020.71978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883020.72239: stderr chunk (state=3): >>><<< 28983 1726883020.72243: stdout chunk (state=3): >>><<< 28983 1726883020.72246: done transferring module to remote 28983 1726883020.72248: _low_level_execute_command(): starting 28983 1726883020.72251: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/ /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py && sleep 0' 28983 1726883020.72638: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883020.72653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883020.72664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.72683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883020.72695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 <<< 28983 1726883020.72705: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883020.72713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.72729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883020.72741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883020.72749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883020.72758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883020.72768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.72789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883020.72804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883020.72813: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883020.72820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.72903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883020.72918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883020.72938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883020.73026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883020.74941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883020.74983: stderr chunk (state=3): >>><<< 28983 1726883020.74987: stdout chunk (state=3): >>><<< 28983 1726883020.74998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883020.75001: _low_level_execute_command(): starting 28983 1726883020.75009: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/AnsiballZ_package_facts.py && sleep 0' 28983 1726883020.75396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883020.75428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883020.75431: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.75437: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883020.75441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883020.75494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883020.75499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883020.75577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883021.39542: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 28983 1726883021.39562: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": 
"glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": 
[{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": 
"0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": 
"2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name":
"groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version":
"24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", 
"version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", 
"release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": 
[{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": 
"73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": 
[{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0",
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": 
[{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", 
"version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726883021.39605: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": 
"perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": 
"502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": 
"2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 28983 1726883021.39609: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", 
"version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 28983 1726883021.39613: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": 
[{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883021.41497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883021.41554: stderr chunk (state=3): >>><<< 28983 1726883021.41557: stdout chunk (state=3): >>><<< 28983 1726883021.41601: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883021.43886: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883021.43904: _low_level_execute_command(): starting 28983 1726883021.43910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883020.6445837-30762-166002495481362/ > /dev/null 2>&1 && sleep 0' 28983 1726883021.44385: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883021.44388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883021.44391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883021.44393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883021.44396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883021.44447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883021.44451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883021.44527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883021.46473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883021.46524: stderr chunk (state=3): >>><<< 28983 1726883021.46527: stdout chunk (state=3): >>><<< 28983 1726883021.46544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883021.46552: handler run 
complete 28983 1726883021.47368: variable 'ansible_facts' from source: unknown 28983 1726883021.47888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.49848: variable 'ansible_facts' from source: unknown 28983 1726883021.50288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.51059: attempt loop complete, returning result 28983 1726883021.51078: _execute() done 28983 1726883021.51081: dumping result to json 28983 1726883021.51266: done dumping result, returning 28983 1726883021.51276: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000000d73] 28983 1726883021.51281: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d73 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883021.53573: no more pending results, returning what we have 28983 1726883021.53576: results queue empty 28983 1726883021.53576: checking for any_errors_fatal 28983 1726883021.53580: done checking for any_errors_fatal 28983 1726883021.53581: checking for max_fail_percentage 28983 1726883021.53582: done checking for max_fail_percentage 28983 1726883021.53583: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.53583: done checking to see if all hosts have failed 28983 1726883021.53584: getting the remaining hosts for this loop 28983 1726883021.53585: done getting the remaining hosts for this loop 28983 1726883021.53588: getting the next task for host managed_node2 28983 1726883021.53594: done getting next task for host managed_node2 28983 1726883021.53597: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883021.53602: ^ state is: HOST STATE: block=5, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883021.53616: getting variables 28983 1726883021.53617: in VariableManager get_vars() 28983 1726883021.53643: Calling all_inventory to load vars for managed_node2 28983 1726883021.53646: Calling groups_inventory to load vars for managed_node2 28983 1726883021.53647: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.53655: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.53658: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.53661: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.54179: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d73 28983 1726883021.54183: WORKER PROCESS EXITING 28983 1726883021.54810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.56394: done with get_vars() 28983 1726883021.56418: done getting variables 28983 1726883021.56468: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:43:41 -0400 (0:00:00.978) 0:00:51.562 ****** 28983 1726883021.56504: entering _queue_task() for managed_node2/debug 28983 1726883021.56749: worker is 1 (out of 1 available) 28983 1726883021.56764: exiting _queue_task() for managed_node2/debug 28983 1726883021.56779: done queuing things up, now waiting for results queue to drain 28983 1726883021.56781: waiting for pending results... 
28983 1726883021.56977: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883021.57090: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d17 28983 1726883021.57104: variable 'ansible_search_path' from source: unknown 28983 1726883021.57107: variable 'ansible_search_path' from source: unknown 28983 1726883021.57144: calling self._execute() 28983 1726883021.57231: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.57342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.57346: variable 'omit' from source: magic vars 28983 1726883021.57568: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.57580: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.57586: variable 'omit' from source: magic vars 28983 1726883021.57641: variable 'omit' from source: magic vars 28983 1726883021.57721: variable 'network_provider' from source: set_fact 28983 1726883021.57738: variable 'omit' from source: magic vars 28983 1726883021.57781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883021.57810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883021.57829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883021.57847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883021.57857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883021.57887: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883021.57891: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883021.57894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.57975: Set connection var ansible_connection to ssh 28983 1726883021.57984: Set connection var ansible_shell_executable to /bin/sh 28983 1726883021.57994: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883021.58004: Set connection var ansible_timeout to 10 28983 1726883021.58011: Set connection var ansible_pipelining to False 28983 1726883021.58014: Set connection var ansible_shell_type to sh 28983 1726883021.58032: variable 'ansible_shell_executable' from source: unknown 28983 1726883021.58037: variable 'ansible_connection' from source: unknown 28983 1726883021.58040: variable 'ansible_module_compression' from source: unknown 28983 1726883021.58044: variable 'ansible_shell_type' from source: unknown 28983 1726883021.58048: variable 'ansible_shell_executable' from source: unknown 28983 1726883021.58052: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.58057: variable 'ansible_pipelining' from source: unknown 28983 1726883021.58061: variable 'ansible_timeout' from source: unknown 28983 1726883021.58066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.58184: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883021.58193: variable 'omit' from source: magic vars 28983 1726883021.58199: starting attempt loop 28983 1726883021.58203: running the handler 28983 1726883021.58246: handler run complete 28983 1726883021.58259: attempt loop complete, returning result 28983 1726883021.58262: _execute() done 28983 1726883021.58264: dumping result to json 28983 1726883021.58270: done dumping result, returning 
28983 1726883021.58278: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000000d17] 28983 1726883021.58284: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d17 28983 1726883021.58380: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d17 28983 1726883021.58383: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883021.58452: no more pending results, returning what we have 28983 1726883021.58456: results queue empty 28983 1726883021.58457: checking for any_errors_fatal 28983 1726883021.58464: done checking for any_errors_fatal 28983 1726883021.58465: checking for max_fail_percentage 28983 1726883021.58466: done checking for max_fail_percentage 28983 1726883021.58467: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.58468: done checking to see if all hosts have failed 28983 1726883021.58469: getting the remaining hosts for this loop 28983 1726883021.58473: done getting the remaining hosts for this loop 28983 1726883021.58477: getting the next task for host managed_node2 28983 1726883021.58485: done getting next task for host managed_node2 28983 1726883021.58490: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883021.58496: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883021.58506: getting variables 28983 1726883021.58508: in VariableManager get_vars() 28983 1726883021.58547: Calling all_inventory to load vars for managed_node2 28983 1726883021.58550: Calling groups_inventory to load vars for managed_node2 28983 1726883021.58553: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.58561: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.58565: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.58568: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.59825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.61407: done with get_vars() 28983 1726883021.61429: done getting variables 28983 1726883021.61478: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:43:41 -0400 (0:00:00.050) 0:00:51.612 ****** 28983 1726883021.61509: entering _queue_task() for managed_node2/fail 28983 1726883021.61729: worker is 1 (out of 1 available) 28983 1726883021.61746: exiting _queue_task() for managed_node2/fail 28983 1726883021.61758: done queuing things up, now waiting for results queue to drain 28983 1726883021.61759: waiting for pending results... 28983 1726883021.61938: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883021.62051: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d18 28983 1726883021.62065: variable 'ansible_search_path' from source: unknown 28983 1726883021.62068: variable 'ansible_search_path' from source: unknown 28983 1726883021.62100: calling self._execute() 28983 1726883021.62178: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.62183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.62196: variable 'omit' from source: magic vars 28983 1726883021.62500: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.62510: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.62614: variable 'network_state' from source: role '' defaults 28983 1726883021.62624: Evaluated conditional (network_state != {}): False 28983 1726883021.62627: when evaluation is False, skipping this task 28983 1726883021.62630: _execute() done 28983 1726883021.62636: dumping result to json 28983 1726883021.62642: done dumping result, returning 28983 1726883021.62652: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000000d18] 28983 1726883021.62655: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d18 28983 1726883021.62752: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d18 28983 1726883021.62755: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883021.62810: no more pending results, returning what we have 28983 1726883021.62814: results queue empty 28983 1726883021.62815: checking for any_errors_fatal 28983 1726883021.62819: done checking for any_errors_fatal 28983 1726883021.62820: checking for max_fail_percentage 28983 1726883021.62822: done checking for max_fail_percentage 28983 1726883021.62823: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.62824: done checking to see if all hosts have failed 28983 1726883021.62825: getting the remaining hosts for this loop 28983 1726883021.62827: done getting the remaining hosts for this loop 28983 1726883021.62830: getting the next task for host managed_node2 28983 1726883021.62839: done getting next task for host managed_node2 28983 1726883021.62843: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883021.62849: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883021.62867: getting variables 28983 1726883021.62869: in VariableManager get_vars() 28983 1726883021.62904: Calling all_inventory to load vars for managed_node2 28983 1726883021.62908: Calling groups_inventory to load vars for managed_node2 28983 1726883021.62911: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.62918: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.62921: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.62923: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.64090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.65755: done with get_vars() 28983 1726883021.65779: done getting variables 28983 1726883021.65825: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:43:41 -0400 (0:00:00.043) 0:00:51.656 ****** 28983 1726883021.65854: entering _queue_task() for managed_node2/fail 28983 1726883021.66060: worker is 1 (out of 1 available) 28983 1726883021.66077: exiting _queue_task() for managed_node2/fail 28983 1726883021.66090: done queuing things up, now waiting for results queue to drain 28983 1726883021.66092: waiting for pending results... 28983 1726883021.66265: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883021.66368: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d19 28983 1726883021.66381: variable 'ansible_search_path' from source: unknown 28983 1726883021.66385: variable 'ansible_search_path' from source: unknown 28983 1726883021.66415: calling self._execute() 28983 1726883021.66497: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.66502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.66514: variable 'omit' from source: magic vars 28983 1726883021.66812: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.66823: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.66925: variable 'network_state' from source: role '' defaults 28983 1726883021.66936: Evaluated conditional (network_state != {}): False 28983 1726883021.66939: when evaluation is False, skipping this task 28983 1726883021.66942: _execute() done 28983 1726883021.66946: dumping result to json 28983 1726883021.66951: done dumping result, returning 28983 1726883021.66958: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000000d19] 28983 1726883021.66967: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d19 28983 1726883021.67065: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d19 28983 1726883021.67068: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883021.67123: no more pending results, returning what we have 28983 1726883021.67127: results queue empty 28983 1726883021.67128: checking for any_errors_fatal 28983 1726883021.67136: done checking for any_errors_fatal 28983 1726883021.67137: checking for max_fail_percentage 28983 1726883021.67139: done checking for max_fail_percentage 28983 1726883021.67140: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.67141: done checking to see if all hosts have failed 28983 1726883021.67142: getting the remaining hosts for this loop 28983 1726883021.67143: done getting the remaining hosts for this loop 28983 1726883021.67147: getting the next task for host managed_node2 28983 1726883021.67154: done getting next task for host managed_node2 28983 1726883021.67158: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883021.67163: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883021.67185: getting variables 28983 1726883021.67186: in VariableManager get_vars() 28983 1726883021.67218: Calling all_inventory to load vars for managed_node2 28983 1726883021.67220: Calling groups_inventory to load vars for managed_node2 28983 1726883021.67222: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.67228: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.67230: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.67233: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.68404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.69989: done with get_vars() 28983 1726883021.70011: done getting variables 28983 1726883021.70058: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:43:41 -0400 (0:00:00.042) 0:00:51.698 ****** 28983 1726883021.70087: entering _queue_task() for managed_node2/fail 28983 1726883021.70296: worker is 1 (out of 1 available) 28983 1726883021.70310: exiting _queue_task() for managed_node2/fail 28983 1726883021.70323: done queuing things up, now waiting for results queue to drain 28983 1726883021.70325: waiting for pending results... 28983 1726883021.70509: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883021.70621: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d1a 28983 1726883021.70633: variable 'ansible_search_path' from source: unknown 28983 1726883021.70640: variable 'ansible_search_path' from source: unknown 28983 1726883021.70674: calling self._execute() 28983 1726883021.70750: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.70754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.70793: variable 'omit' from source: magic vars 28983 1726883021.71080: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.71092: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.71241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883021.73289: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883021.73340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883021.73373: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883021.73405: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883021.73428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883021.73502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.73524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.73547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.73581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.73593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.73673: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.73685: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883021.73779: variable 'ansible_distribution' from source: facts 28983 1726883021.73783: variable '__network_rh_distros' from source: role '' defaults 28983 1726883021.73793: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883021.73796: when evaluation is False, skipping this task 28983 
1726883021.73800: _execute() done 28983 1726883021.73803: dumping result to json 28983 1726883021.73809: done dumping result, returning 28983 1726883021.73818: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000000d1a] 28983 1726883021.73821: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1a 28983 1726883021.73917: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1a 28983 1726883021.73921: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883021.73988: no more pending results, returning what we have 28983 1726883021.73991: results queue empty 28983 1726883021.73992: checking for any_errors_fatal 28983 1726883021.74000: done checking for any_errors_fatal 28983 1726883021.74001: checking for max_fail_percentage 28983 1726883021.74003: done checking for max_fail_percentage 28983 1726883021.74004: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.74005: done checking to see if all hosts have failed 28983 1726883021.74006: getting the remaining hosts for this loop 28983 1726883021.74008: done getting the remaining hosts for this loop 28983 1726883021.74013: getting the next task for host managed_node2 28983 1726883021.74021: done getting next task for host managed_node2 28983 1726883021.74025: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883021.74031: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883021.74052: getting variables 28983 1726883021.74054: in VariableManager get_vars() 28983 1726883021.74093: Calling all_inventory to load vars for managed_node2 28983 1726883021.74096: Calling groups_inventory to load vars for managed_node2 28983 1726883021.74098: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.74107: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.74110: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.74114: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.75477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.77044: done with get_vars() 28983 1726883021.77068: done getting variables 28983 1726883021.77116: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:43:41 -0400 (0:00:00.070) 0:00:51.769 ****** 28983 1726883021.77144: entering _queue_task() for managed_node2/dnf 28983 1726883021.77370: worker is 1 (out of 1 available) 28983 1726883021.77386: exiting _queue_task() for managed_node2/dnf 28983 1726883021.77399: done queuing things up, now waiting for results queue to drain 28983 1726883021.77401: waiting for pending results... 28983 1726883021.77591: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883021.77700: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d1b 28983 1726883021.77712: variable 'ansible_search_path' from source: unknown 28983 1726883021.77716: variable 'ansible_search_path' from source: unknown 28983 1726883021.77753: calling self._execute() 28983 1726883021.77831: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.77840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.77852: variable 'omit' from source: magic vars 28983 1726883021.78155: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.78166: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.78336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883021.80035: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883021.80090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883021.80120: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883021.80153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883021.80178: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883021.80245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.80282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.80304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.80337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.80351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.80439: variable 'ansible_distribution' from source: facts 28983 1726883021.80443: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.80451: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883021.80544: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883021.80655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.80678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.80700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.80731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.80746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.80783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.80807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.80827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.80859: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.80874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.80908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.80930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.80953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.80986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.80998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.81136: variable 'network_connections' from source: include params 28983 1726883021.81147: variable 'interface' from source: play vars 28983 1726883021.81202: variable 'interface' from source: play vars 28983 1726883021.81266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883021.81401: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883021.81433: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883021.81464: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883021.81492: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883021.81526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883021.81548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883021.81576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.81598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883021.81641: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883021.81852: variable 'network_connections' from source: include params 28983 1726883021.81856: variable 'interface' from source: play vars 28983 1726883021.81910: variable 'interface' from source: play vars 28983 1726883021.81930: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883021.81935: when evaluation is False, skipping this task 28983 1726883021.81938: _execute() done 28983 1726883021.81943: dumping result to json 28983 1726883021.81947: done dumping result, returning 28983 1726883021.81954: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000d1b] 28983 1726883021.81960: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1b 28983 1726883021.82053: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1b 28983 1726883021.82056: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883021.82112: no more pending results, returning what we have 28983 1726883021.82115: results queue empty 28983 1726883021.82116: checking for any_errors_fatal 28983 1726883021.82123: done checking for any_errors_fatal 28983 1726883021.82124: checking for max_fail_percentage 28983 1726883021.82126: done checking for max_fail_percentage 28983 1726883021.82127: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.82128: done checking to see if all hosts have failed 28983 1726883021.82128: getting the remaining hosts for this loop 28983 1726883021.82131: done getting the remaining hosts for this loop 28983 1726883021.82138: getting the next task for host managed_node2 28983 1726883021.82146: done getting next task for host managed_node2 28983 1726883021.82150: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883021.82155: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883021.82176: getting variables 28983 1726883021.82178: in VariableManager get_vars() 28983 1726883021.82211: Calling all_inventory to load vars for managed_node2 28983 1726883021.82214: Calling groups_inventory to load vars for managed_node2 28983 1726883021.82216: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.82224: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.82227: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.82229: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.83449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.85114: done with get_vars() 28983 1726883021.85138: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883021.85196: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:43:41 -0400 (0:00:00.080) 0:00:51.849 ****** 28983 1726883021.85220: entering _queue_task() for managed_node2/yum 28983 1726883021.85436: worker is 1 (out of 1 available) 28983 1726883021.85451: exiting _queue_task() for managed_node2/yum 28983 1726883021.85464: done queuing things up, now waiting for results queue to drain 28983 1726883021.85466: waiting for pending results... 28983 1726883021.85652: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883021.85754: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d1c 28983 1726883021.85766: variable 'ansible_search_path' from source: unknown 28983 1726883021.85770: variable 'ansible_search_path' from source: unknown 28983 1726883021.85806: calling self._execute() 28983 1726883021.85885: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.85890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.85902: variable 'omit' from source: magic vars 28983 1726883021.86212: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.86223: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.86377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883021.88109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883021.88161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883021.88194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883021.88226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883021.88251: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883021.88320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.88353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.88375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.88410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.88423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.88502: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.88516: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883021.88520: when evaluation is False, skipping this task 28983 1726883021.88523: _execute() done 28983 1726883021.88527: dumping result to json 28983 1726883021.88529: done dumping result, returning 28983 1726883021.88540: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000d1c] 28983 1726883021.88545: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883021.88692: no more pending results, returning what we have 28983 1726883021.88695: results queue empty 28983 1726883021.88696: checking for any_errors_fatal 28983 1726883021.88702: done checking for any_errors_fatal 28983 1726883021.88703: checking for max_fail_percentage 28983 1726883021.88706: done checking for max_fail_percentage 28983 1726883021.88707: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.88708: done checking to see if all hosts have failed 28983 1726883021.88708: getting the remaining hosts for this loop 28983 1726883021.88710: done getting the remaining hosts for this loop 28983 1726883021.88715: getting the next task for host managed_node2 28983 1726883021.88723: done getting next task for host managed_node2 28983 1726883021.88727: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883021.88733: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883021.88750: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1c 28983 1726883021.88753: WORKER PROCESS EXITING 28983 1726883021.88767: getting variables 28983 1726883021.88768: in VariableManager get_vars() 28983 1726883021.88802: Calling all_inventory to load vars for managed_node2 28983 1726883021.88805: Calling groups_inventory to load vars for managed_node2 28983 1726883021.88808: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.88816: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.88819: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.88823: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.89993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.91559: done with get_vars() 28983 1726883021.91581: done getting variables 28983 1726883021.91624: Loading ActionModule 'fail' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:43:41 -0400 (0:00:00.064) 0:00:51.914 ****** 28983 1726883021.91653: entering _queue_task() for managed_node2/fail 28983 1726883021.91871: worker is 1 (out of 1 available) 28983 1726883021.91884: exiting _queue_task() for managed_node2/fail 28983 1726883021.91897: done queuing things up, now waiting for results queue to drain 28983 1726883021.91899: waiting for pending results... 28983 1726883021.92108: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883021.92222: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d1d 28983 1726883021.92235: variable 'ansible_search_path' from source: unknown 28983 1726883021.92241: variable 'ansible_search_path' from source: unknown 28983 1726883021.92275: calling self._execute() 28983 1726883021.92358: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883021.92363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883021.92376: variable 'omit' from source: magic vars 28983 1726883021.92693: variable 'ansible_distribution_major_version' from source: facts 28983 1726883021.92704: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883021.92806: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883021.92975: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883021.94942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883021.94995: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883021.95023: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883021.95055: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883021.95081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883021.95147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.95174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.95196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.95229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.95244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.95287: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.95308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.95329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.95361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883021.95376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.95414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883021.95432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883021.95453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.95485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883021.95497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883021.95633: variable 'network_connections' from source: include params 28983 1726883021.95649: variable 'interface' from source: play vars 28983 1726883021.95705: variable 'interface' from source: play vars 28983 1726883021.95770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883021.95914: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883021.95948: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883021.95978: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883021.96004: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883021.96039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883021.96060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883021.96086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883021.96107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883021.96150: variable 
'__network_team_connections_defined' from source: role '' defaults 28983 1726883021.96353: variable 'network_connections' from source: include params 28983 1726883021.96357: variable 'interface' from source: play vars 28983 1726883021.96412: variable 'interface' from source: play vars 28983 1726883021.96431: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883021.96437: when evaluation is False, skipping this task 28983 1726883021.96440: _execute() done 28983 1726883021.96444: dumping result to json 28983 1726883021.96449: done dumping result, returning 28983 1726883021.96456: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000d1d] 28983 1726883021.96461: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1d skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883021.96611: no more pending results, returning what we have 28983 1726883021.96615: results queue empty 28983 1726883021.96616: checking for any_errors_fatal 28983 1726883021.96624: done checking for any_errors_fatal 28983 1726883021.96625: checking for max_fail_percentage 28983 1726883021.96627: done checking for max_fail_percentage 28983 1726883021.96627: checking to see if all hosts have failed and the running result is not ok 28983 1726883021.96628: done checking to see if all hosts have failed 28983 1726883021.96629: getting the remaining hosts for this loop 28983 1726883021.96631: done getting the remaining hosts for this loop 28983 1726883021.96638: getting the next task for host managed_node2 28983 1726883021.96648: done getting next task for host managed_node2 28983 1726883021.96652: ^ task is: TASK: 
fedora.linux_system_roles.network : Install packages 28983 1726883021.96658: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883021.96677: getting variables 28983 1726883021.96679: in VariableManager get_vars() 28983 1726883021.96717: Calling all_inventory to load vars for managed_node2 28983 1726883021.96721: Calling groups_inventory to load vars for managed_node2 28983 1726883021.96723: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883021.96732: Calling all_plugins_play to load vars for managed_node2 28983 1726883021.96742: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883021.96747: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1d 28983 1726883021.96750: WORKER PROCESS EXITING 28983 1726883021.96754: Calling groups_plugins_play to load vars for managed_node2 28983 1726883021.98145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883021.99712: done with get_vars() 28983 1726883021.99735: done getting variables 28983 1726883021.99784: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:43:41 -0400 (0:00:00.081) 0:00:51.995 ****** 28983 1726883021.99812: entering _queue_task() for managed_node2/package 28983 1726883022.00032: worker is 1 (out of 1 available) 28983 1726883022.00049: exiting _queue_task() for managed_node2/package 28983 1726883022.00062: done queuing things up, now waiting for results queue to drain 28983 1726883022.00064: waiting for pending results... 
28983 1726883022.00290: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883022.00484: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d1e 28983 1726883022.00507: variable 'ansible_search_path' from source: unknown 28983 1726883022.00516: variable 'ansible_search_path' from source: unknown 28983 1726883022.00564: calling self._execute() 28983 1726883022.00675: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.00691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883022.00712: variable 'omit' from source: magic vars 28983 1726883022.01164: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.01187: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883022.01440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883022.01765: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883022.01823: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883022.01873: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883022.02140: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883022.02144: variable 'network_packages' from source: role '' defaults 28983 1726883022.02229: variable '__network_provider_setup' from source: role '' defaults 28983 1726883022.02250: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883022.02338: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883022.02356: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883022.02442: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726883022.02710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883022.05023: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883022.05106: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883022.05157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883022.05204: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883022.05243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883022.05377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.05415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.05457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.05514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.05540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883022.05602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.05641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.05679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.05739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.05763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.06066: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883022.06359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.06363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.06366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.06368: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.06384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.06501: variable 'ansible_python' from source: facts 28983 1726883022.06529: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883022.06635: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883022.06744: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883022.06914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.06952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.06990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.07047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.07071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.07132: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.07179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.07215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.07273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.07299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.07490: variable 'network_connections' from source: include params 28983 1726883022.07503: variable 'interface' from source: play vars 28983 1726883022.07623: variable 'interface' from source: play vars 28983 1726883022.07727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883022.07769: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883022.07813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.07861: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883022.07920: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883022.08312: variable 'network_connections' from source: include params 28983 1726883022.08324: variable 'interface' from source: play vars 28983 1726883022.08455: variable 'interface' from source: play vars 28983 1726883022.08497: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883022.08607: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883022.09093: variable 'network_connections' from source: include params 28983 1726883022.09097: variable 'interface' from source: play vars 28983 1726883022.09143: variable 'interface' from source: play vars 28983 1726883022.09175: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883022.09284: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883022.09715: variable 'network_connections' from source: include params 28983 1726883022.09728: variable 'interface' from source: play vars 28983 1726883022.09816: variable 'interface' from source: play vars 28983 1726883022.09943: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883022.09982: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883022.09996: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883022.10082: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883022.10401: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883022.11087: variable 'network_connections' from source: include params 28983 1726883022.11099: variable 'interface' from 
source: play vars 28983 1726883022.11186: variable 'interface' from source: play vars 28983 1726883022.11240: variable 'ansible_distribution' from source: facts 28983 1726883022.11244: variable '__network_rh_distros' from source: role '' defaults 28983 1726883022.11247: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.11249: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883022.11488: variable 'ansible_distribution' from source: facts 28983 1726883022.11500: variable '__network_rh_distros' from source: role '' defaults 28983 1726883022.11513: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.11525: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883022.11941: variable 'ansible_distribution' from source: facts 28983 1726883022.11944: variable '__network_rh_distros' from source: role '' defaults 28983 1726883022.11947: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.11949: variable 'network_provider' from source: set_fact 28983 1726883022.11951: variable 'ansible_facts' from source: unknown 28983 1726883022.13224: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883022.13237: when evaluation is False, skipping this task 28983 1726883022.13246: _execute() done 28983 1726883022.13255: dumping result to json 28983 1726883022.13265: done dumping result, returning 28983 1726883022.13283: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000000d1e] 28983 1726883022.13340: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1e skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 
1726883022.13600: no more pending results, returning what we have 28983 1726883022.13605: results queue empty 28983 1726883022.13606: checking for any_errors_fatal 28983 1726883022.13617: done checking for any_errors_fatal 28983 1726883022.13618: checking for max_fail_percentage 28983 1726883022.13621: done checking for max_fail_percentage 28983 1726883022.13622: checking to see if all hosts have failed and the running result is not ok 28983 1726883022.13623: done checking to see if all hosts have failed 28983 1726883022.13624: getting the remaining hosts for this loop 28983 1726883022.13626: done getting the remaining hosts for this loop 28983 1726883022.13632: getting the next task for host managed_node2 28983 1726883022.13644: done getting next task for host managed_node2 28983 1726883022.13650: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883022.13656: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883022.13689: getting variables 28983 1726883022.13691: in VariableManager get_vars() 28983 1726883022.13842: Calling all_inventory to load vars for managed_node2 28983 1726883022.13847: Calling groups_inventory to load vars for managed_node2 28983 1726883022.13854: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1e 28983 1726883022.13867: WORKER PROCESS EXITING 28983 1726883022.13862: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883022.13880: Calling all_plugins_play to load vars for managed_node2 28983 1726883022.13885: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883022.13890: Calling groups_plugins_play to load vars for managed_node2 28983 1726883022.16242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883022.19279: done with get_vars() 28983 1726883022.19315: done getting variables 28983 1726883022.19388: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:43:42 -0400 (0:00:00.196) 0:00:52.192 ****** 28983 1726883022.19430: entering _queue_task() for managed_node2/package 28983 1726883022.19785: worker is 1 (out of 1 available) 28983 1726883022.19800: exiting _queue_task() for managed_node2/package 28983 1726883022.19814: done queuing things up, now waiting for results queue to drain 28983 
1726883022.19816: waiting for pending results... 28983 1726883022.20144: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883022.20321: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d1f 28983 1726883022.20349: variable 'ansible_search_path' from source: unknown 28983 1726883022.20359: variable 'ansible_search_path' from source: unknown 28983 1726883022.20409: calling self._execute() 28983 1726883022.20530: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.20548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883022.20568: variable 'omit' from source: magic vars 28983 1726883022.21033: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.21054: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883022.21218: variable 'network_state' from source: role '' defaults 28983 1726883022.21241: Evaluated conditional (network_state != {}): False 28983 1726883022.21251: when evaluation is False, skipping this task 28983 1726883022.21260: _execute() done 28983 1726883022.21269: dumping result to json 28983 1726883022.21278: done dumping result, returning 28983 1726883022.21293: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000d1f] 28983 1726883022.21351: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883022.21592: no more pending results, returning what we have 28983 1726883022.21597: results queue empty 28983 1726883022.21598: checking for any_errors_fatal 28983 1726883022.21606: done checking for any_errors_fatal 28983 
1726883022.21607: checking for max_fail_percentage 28983 1726883022.21611: done checking for max_fail_percentage 28983 1726883022.21612: checking to see if all hosts have failed and the running result is not ok 28983 1726883022.21613: done checking to see if all hosts have failed 28983 1726883022.21614: getting the remaining hosts for this loop 28983 1726883022.21616: done getting the remaining hosts for this loop 28983 1726883022.21621: getting the next task for host managed_node2 28983 1726883022.21632: done getting next task for host managed_node2 28983 1726883022.21639: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883022.21646: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883022.21672: getting variables 28983 1726883022.21674: in VariableManager get_vars() 28983 1726883022.21720: Calling all_inventory to load vars for managed_node2 28983 1726883022.21724: Calling groups_inventory to load vars for managed_node2 28983 1726883022.21727: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883022.21924: Calling all_plugins_play to load vars for managed_node2 28983 1726883022.21929: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883022.21938: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d1f 28983 1726883022.21941: WORKER PROCESS EXITING 28983 1726883022.21945: Calling groups_plugins_play to load vars for managed_node2 28983 1726883022.24056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883022.26979: done with get_vars() 28983 1726883022.27018: done getting variables 28983 1726883022.27091: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:43:42 -0400 (0:00:00.077) 0:00:52.269 ****** 28983 1726883022.27136: entering _queue_task() for managed_node2/package 28983 1726883022.27481: worker is 1 (out of 1 available) 28983 1726883022.27497: exiting _queue_task() for managed_node2/package 28983 1726883022.27511: done queuing things up, now waiting for results queue to drain 28983 1726883022.27513: waiting for pending results... 
28983 1726883022.27828: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883022.28003: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d20 28983 1726883022.28027: variable 'ansible_search_path' from source: unknown 28983 1726883022.28039: variable 'ansible_search_path' from source: unknown 28983 1726883022.28089: calling self._execute() 28983 1726883022.28208: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.28221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883022.28242: variable 'omit' from source: magic vars 28983 1726883022.28690: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.28709: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883022.28874: variable 'network_state' from source: role '' defaults 28983 1726883022.28894: Evaluated conditional (network_state != {}): False 28983 1726883022.28903: when evaluation is False, skipping this task 28983 1726883022.28910: _execute() done 28983 1726883022.28918: dumping result to json 28983 1726883022.28926: done dumping result, returning 28983 1726883022.28942: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000000d20] 28983 1726883022.28954: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d20 28983 1726883022.29084: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d20 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883022.29146: no more pending results, returning what we have 28983 1726883022.29150: results queue empty 28983 1726883022.29151: checking for any_errors_fatal 28983 1726883022.29159: done checking for 
any_errors_fatal 28983 1726883022.29160: checking for max_fail_percentage 28983 1726883022.29162: done checking for max_fail_percentage 28983 1726883022.29163: checking to see if all hosts have failed and the running result is not ok 28983 1726883022.29164: done checking to see if all hosts have failed 28983 1726883022.29165: getting the remaining hosts for this loop 28983 1726883022.29168: done getting the remaining hosts for this loop 28983 1726883022.29173: getting the next task for host managed_node2 28983 1726883022.29183: done getting next task for host managed_node2 28983 1726883022.29188: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883022.29196: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883022.29220: getting variables 28983 1726883022.29222: in VariableManager get_vars() 28983 1726883022.29271: Calling all_inventory to load vars for managed_node2 28983 1726883022.29275: Calling groups_inventory to load vars for managed_node2 28983 1726883022.29278: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883022.29291: Calling all_plugins_play to load vars for managed_node2 28983 1726883022.29295: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883022.29299: Calling groups_plugins_play to load vars for managed_node2 28983 1726883022.29951: WORKER PROCESS EXITING 28983 1726883022.37762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883022.40857: done with get_vars() 28983 1726883022.40898: done getting variables 28983 1726883022.40965: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:43:42 -0400 (0:00:00.138) 0:00:52.407 ****** 28983 1726883022.41005: entering _queue_task() for managed_node2/service 28983 1726883022.41560: worker is 1 (out of 1 available) 28983 1726883022.41575: exiting _queue_task() for managed_node2/service 28983 1726883022.41587: done queuing things up, now waiting for results queue to drain 28983 1726883022.41590: waiting for pending results... 
28983 1726883022.41806: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883022.42026: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d21 28983 1726883022.42060: variable 'ansible_search_path' from source: unknown 28983 1726883022.42074: variable 'ansible_search_path' from source: unknown 28983 1726883022.42125: calling self._execute() 28983 1726883022.42259: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.42281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883022.42340: variable 'omit' from source: magic vars 28983 1726883022.42824: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.42846: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883022.43030: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883022.43330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883022.46159: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883022.46290: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883022.46340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883022.46398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883022.46549: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883022.46553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883022.46595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.46630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.46707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.46732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.46809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.46845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.46890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.46952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.46976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.47046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.47083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.47215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.47219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.47221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.47484: variable 'network_connections' from source: include params 28983 1726883022.47502: variable 'interface' from source: play vars 28983 1726883022.47599: variable 'interface' from source: play vars 28983 1726883022.48049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883022.48211: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883022.48351: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883022.48405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883022.48702: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883022.48706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883022.48709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883022.48827: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.48869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883022.49052: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883022.49511: variable 'network_connections' from source: include params 28983 1726883022.49522: variable 'interface' from source: play vars 28983 1726883022.49614: variable 'interface' from source: play vars 28983 1726883022.49649: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883022.49658: when evaluation is False, skipping this task 28983 1726883022.49665: _execute() done 28983 1726883022.49681: dumping result to json 28983 1726883022.49695: done dumping result, returning 28983 1726883022.49706: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000000d21] 28983 1726883022.49716: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d21 skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883022.49905: no more pending results, returning what we have 28983 1726883022.49909: results queue empty 28983 1726883022.49910: checking for any_errors_fatal 28983 1726883022.49923: done checking for any_errors_fatal 28983 1726883022.49924: checking for max_fail_percentage 28983 1726883022.49926: done checking for max_fail_percentage 28983 1726883022.49927: checking to see if all hosts have failed and the running result is not ok 28983 1726883022.49928: done checking to see if all hosts have failed 28983 1726883022.49929: getting the remaining hosts for this loop 28983 1726883022.49941: done getting the remaining hosts for this loop 28983 1726883022.49948: getting the next task for host managed_node2 28983 1726883022.49959: done getting next task for host managed_node2 28983 1726883022.49965: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883022.49975: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883022.50001: getting variables 28983 1726883022.50003: in VariableManager get_vars() 28983 1726883022.50181: Calling all_inventory to load vars for managed_node2 28983 1726883022.50185: Calling groups_inventory to load vars for managed_node2 28983 1726883022.50188: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883022.50195: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d21 28983 1726883022.50198: WORKER PROCESS EXITING 28983 1726883022.50207: Calling all_plugins_play to load vars for managed_node2 28983 1726883022.50212: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883022.50215: Calling groups_plugins_play to load vars for managed_node2 28983 1726883022.52526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883022.55620: done with get_vars() 28983 1726883022.55659: done getting variables 28983 1726883022.55728: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:43:42 -0400 (0:00:00.147) 0:00:52.555 ****** 28983 1726883022.55770: entering _queue_task() for managed_node2/service 28983 1726883022.56128: worker is 1 (out of 1 available) 28983 1726883022.56246: exiting _queue_task() for managed_node2/service 28983 1726883022.56261: done queuing 
things up, now waiting for results queue to drain 28983 1726883022.56263: waiting for pending results... 28983 1726883022.56494: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883022.56680: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d22 28983 1726883022.56706: variable 'ansible_search_path' from source: unknown 28983 1726883022.56721: variable 'ansible_search_path' from source: unknown 28983 1726883022.56768: calling self._execute() 28983 1726883022.56889: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.56904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883022.56922: variable 'omit' from source: magic vars 28983 1726883022.57392: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.57411: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883022.57638: variable 'network_provider' from source: set_fact 28983 1726883022.57651: variable 'network_state' from source: role '' defaults 28983 1726883022.57668: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883022.57679: variable 'omit' from source: magic vars 28983 1726883022.57770: variable 'omit' from source: magic vars 28983 1726883022.57811: variable 'network_service_name' from source: role '' defaults 28983 1726883022.57897: variable 'network_service_name' from source: role '' defaults 28983 1726883022.58048: variable '__network_provider_setup' from source: role '' defaults 28983 1726883022.58061: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883022.58154: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883022.58169: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883022.58341: variable '__network_packages_default_nm' from source: role '' defaults 
28983 1726883022.58581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883022.61181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883022.61269: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883022.61323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883022.61371: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883022.61409: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883022.61503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.61549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.61583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.61632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.61739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.61743: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.61756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.61793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.61843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.61867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.62169: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883022.62325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.62416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.62420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.62443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.62465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.62579: variable 'ansible_python' from source: facts 28983 1726883022.62600: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883022.62702: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883022.62802: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883022.62976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.63007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.63040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.63100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.63178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.63187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883022.63229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883022.63268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.63324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883022.63347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883022.63529: variable 'network_connections' from source: include params 28983 1726883022.63544: variable 'interface' from source: play vars 28983 1726883022.63630: variable 'interface' from source: play vars 28983 1726883022.63840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883022.64002: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883022.64071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883022.64125: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883022.64186: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883022.64262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883022.64307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883022.64354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883022.64402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883022.64462: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883022.64933: variable 'network_connections' from source: include params 28983 1726883022.64937: variable 'interface' from source: play vars 28983 1726883022.64961: variable 'interface' from source: play vars 28983 1726883022.65002: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883022.65108: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883022.65512: variable 'network_connections' from source: include params 28983 1726883022.65523: variable 'interface' from source: play vars 28983 1726883022.65616: variable 'interface' from source: play vars 28983 1726883022.65650: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883022.65758: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883022.66181: variable 'network_connections' from source: include params 28983 1726883022.66192: variable 'interface' from source: play vars 28983 1726883022.66339: variable 'interface' from source: play vars 28983 1726883022.66360: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883022.66443: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883022.66457: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883022.66544: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883022.66875: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883022.67591: variable 'network_connections' from source: include params 28983 1726883022.67601: variable 'interface' from source: play vars 28983 1726883022.67688: variable 'interface' from source: play vars 28983 1726883022.67702: variable 'ansible_distribution' from source: facts 28983 1726883022.67740: variable '__network_rh_distros' from source: role '' defaults 28983 1726883022.67743: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.67745: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883022.67991: variable 'ansible_distribution' from source: facts 28983 1726883022.68003: variable '__network_rh_distros' from source: role '' defaults 28983 1726883022.68018: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.68029: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883022.68322: variable 'ansible_distribution' from source: facts 28983 1726883022.68326: variable '__network_rh_distros' from source: role '' defaults 28983 1726883022.68328: variable 'ansible_distribution_major_version' from source: facts 28983 1726883022.68346: variable 'network_provider' from source: set_fact 28983 1726883022.68375: variable 'omit' from source: magic vars 28983 1726883022.68408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883022.68454: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883022.68483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883022.68509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883022.68540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883022.68573: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883022.68583: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.68647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883022.68731: Set connection var ansible_connection to ssh 28983 1726883022.68752: Set connection var ansible_shell_executable to /bin/sh 28983 1726883022.68772: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883022.68791: Set connection var ansible_timeout to 10 28983 1726883022.68802: Set connection var ansible_pipelining to False 28983 1726883022.68809: Set connection var ansible_shell_type to sh 28983 1726883022.68842: variable 'ansible_shell_executable' from source: unknown 28983 1726883022.68850: variable 'ansible_connection' from source: unknown 28983 1726883022.68858: variable 'ansible_module_compression' from source: unknown 28983 1726883022.68876: variable 'ansible_shell_type' from source: unknown 28983 1726883022.68879: variable 'ansible_shell_executable' from source: unknown 28983 1726883022.68939: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883022.68943: variable 'ansible_pipelining' from source: unknown 28983 1726883022.68945: variable 'ansible_timeout' from source: unknown 28983 1726883022.68947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883022.69040: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883022.69066: variable 'omit' from source: magic vars 28983 1726883022.69078: starting attempt loop 28983 1726883022.69085: running the handler 28983 1726883022.69191: variable 'ansible_facts' from source: unknown 28983 1726883022.70501: _low_level_execute_command(): starting 28983 1726883022.70504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883022.71301: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883022.71317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883022.71396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883022.71427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883022.71443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726883022.71555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883022.73353: stdout chunk (state=3): >>>/root <<< 28983 1726883022.73550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883022.73553: stdout chunk (state=3): >>><<< 28983 1726883022.73556: stderr chunk (state=3): >>><<< 28983 1726883022.73575: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883022.73676: _low_level_execute_command(): starting 28983 1726883022.73681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543 `" && echo ansible-tmp-1726883022.735824-30837-39043560309543="` echo 
/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543 `" ) && sleep 0' 28983 1726883022.74219: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883022.74239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883022.74259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883022.74285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883022.74304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883022.74349: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883022.74428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883022.74462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883022.74512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883022.74583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883022.76593: stdout chunk (state=3): >>>ansible-tmp-1726883022.735824-30837-39043560309543=/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543 <<< 28983 1726883022.76775: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 28983 1726883022.76791: stdout chunk (state=3): >>><<< 28983 1726883022.76803: stderr chunk (state=3): >>><<< 28983 1726883022.76822: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883022.735824-30837-39043560309543=/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883022.76863: variable 'ansible_module_compression' from source: unknown 28983 1726883022.76926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883022.76988: variable 'ansible_facts' from source: unknown 28983 1726883022.77308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py 28983 1726883022.77427: Sending initial data 28983 1726883022.77430: Sent initial data 
(154 bytes) 28983 1726883022.78113: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883022.78171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883022.79864: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883022.79919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883022.80007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpf42i367y /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py <<< 28983 1726883022.80020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py" <<< 28983 1726883022.80104: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpf42i367y" to remote "/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py" <<< 28983 1726883022.82715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883022.82751: stderr chunk (state=3): >>><<< 28983 1726883022.82761: stdout chunk (state=3): >>><<< 28983 1726883022.82791: done transferring module to remote 28983 1726883022.82809: _low_level_execute_command(): starting 28983 1726883022.82834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/ /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py && sleep 0' 28983 1726883022.83436: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883022.83452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883022.83468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883022.83493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 
1726883022.83552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883022.83619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883022.83644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883022.83667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883022.83766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883022.85797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883022.85800: stdout chunk (state=3): >>><<< 28983 1726883022.85803: stderr chunk (state=3): >>><<< 28983 1726883022.85818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883022.85827: _low_level_execute_command(): starting 28983 1726883022.85838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/AnsiballZ_systemd.py && sleep 0' 28983 1726883022.86473: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883022.86483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883022.86488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883022.86494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883022.86497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883022.86605: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883022.86609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883022.86612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883022.86650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883022.86754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883022.86855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883023.19611: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4521984", "MemoryAvailable": "infinity", "CPUUsageNSec": "1541275000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": 
"18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": 
"cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", 
"SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", 
"JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883023.21983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883023.22090: stderr chunk (state=3): >>><<< 28983 1726883023.22094: stdout chunk (state=3): >>><<< 28983 1726883023.22201: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", 
"ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4521984", "MemoryAvailable": "infinity", "CPUUsageNSec": "1541275000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883023.22711: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883023.22742: _low_level_execute_command(): starting 28983 1726883023.22767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883022.735824-30837-39043560309543/ > /dev/null 2>&1 && sleep 0' 28983 1726883023.23986: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883023.23993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883023.24082: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883023.24325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883023.24329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883023.24331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883023.24516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883023.26381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883023.26451: stderr chunk (state=3): >>><<< 28983 1726883023.26455: stdout chunk (state=3): >>><<< 28983 1726883023.26479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883023.26488: handler run complete 28983 1726883023.26575: attempt loop complete, returning result 28983 1726883023.26648: _execute() done 28983 1726883023.26651: dumping result to json 28983 1726883023.26692: done dumping result, returning 28983 1726883023.26696: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000000d22] 28983 1726883023.26802: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d22 28983 1726883023.27469: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d22 28983 1726883023.27472: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883023.27607: no more pending results, returning what we have 28983 1726883023.27611: results queue empty 28983 1726883023.27612: checking for any_errors_fatal 28983 1726883023.27620: done checking for any_errors_fatal 28983 1726883023.27621: checking for max_fail_percentage 28983 1726883023.27623: done checking for max_fail_percentage 28983 1726883023.27624: checking to see if all hosts have failed and the running result is not ok 28983 1726883023.27625: done checking to see if all hosts have failed 28983 1726883023.27626: getting the remaining hosts for this loop 28983 1726883023.27629: done getting the 
remaining hosts for this loop 28983 1726883023.27636: getting the next task for host managed_node2 28983 1726883023.27645: done getting next task for host managed_node2 28983 1726883023.27649: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883023.27657: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883023.27674: getting variables 28983 1726883023.27676: in VariableManager get_vars() 28983 1726883023.27719: Calling all_inventory to load vars for managed_node2 28983 1726883023.27722: Calling groups_inventory to load vars for managed_node2 28983 1726883023.27725: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883023.28139: Calling all_plugins_play to load vars for managed_node2 28983 1726883023.28145: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883023.28150: Calling groups_plugins_play to load vars for managed_node2 28983 1726883023.32729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883023.38831: done with get_vars() 28983 1726883023.39085: done getting variables 28983 1726883023.39160: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:43:43 -0400 (0:00:00.834) 0:00:53.389 ****** 28983 1726883023.39211: entering _queue_task() for managed_node2/service 28983 1726883023.40010: worker is 1 (out of 1 available) 28983 1726883023.40026: exiting _queue_task() for managed_node2/service 28983 1726883023.40243: done queuing things up, now waiting for results queue to drain 28983 1726883023.40246: waiting for pending results... 
28983 1726883023.40975: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883023.40981: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d23 28983 1726883023.41073: variable 'ansible_search_path' from source: unknown 28983 1726883023.41078: variable 'ansible_search_path' from source: unknown 28983 1726883023.41119: calling self._execute() 28983 1726883023.41409: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883023.41424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883023.41461: variable 'omit' from source: magic vars 28983 1726883023.42602: variable 'ansible_distribution_major_version' from source: facts 28983 1726883023.42606: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883023.42854: variable 'network_provider' from source: set_fact 28983 1726883023.42873: Evaluated conditional (network_provider == "nm"): True 28983 1726883023.43107: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883023.43539: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883023.43880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883023.47620: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883023.47710: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883023.47759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883023.47813: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883023.47852: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883023.47955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883023.47995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883023.48039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883023.48100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883023.48127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883023.48192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883023.48233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883023.48271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883023.48333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883023.48436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883023.48442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883023.48454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883023.48491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883023.48550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883023.48573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883023.48908: variable 'network_connections' from source: include params 28983 1726883023.48931: variable 'interface' from source: play vars 28983 1726883023.49242: variable 'interface' from source: play vars 28983 1726883023.49291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883023.49566: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883023.49624: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883023.49687: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883023.49715: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883023.49774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883023.49839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883023.49852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883023.49894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883023.49960: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883023.50325: variable 'network_connections' from source: include params 28983 1726883023.50342: variable 'interface' from source: play vars 28983 1726883023.50427: variable 'interface' from source: play vars 28983 1726883023.50519: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883023.50528: when evaluation is False, skipping this task 28983 1726883023.50537: _execute() done 28983 1726883023.50546: dumping result to json 28983 1726883023.50560: done dumping result, returning 28983 1726883023.50613: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000000d23] 28983 
1726883023.50625: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d23 28983 1726883023.51043: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d23 28983 1726883023.51047: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883023.51108: no more pending results, returning what we have 28983 1726883023.51111: results queue empty 28983 1726883023.51112: checking for any_errors_fatal 28983 1726883023.51139: done checking for any_errors_fatal 28983 1726883023.51140: checking for max_fail_percentage 28983 1726883023.51142: done checking for max_fail_percentage 28983 1726883023.51144: checking to see if all hosts have failed and the running result is not ok 28983 1726883023.51145: done checking to see if all hosts have failed 28983 1726883023.51146: getting the remaining hosts for this loop 28983 1726883023.51148: done getting the remaining hosts for this loop 28983 1726883023.51152: getting the next task for host managed_node2 28983 1726883023.51163: done getting next task for host managed_node2 28983 1726883023.51168: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883023.51177: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883023.51200: getting variables 28983 1726883023.51202: in VariableManager get_vars() 28983 1726883023.51284: Calling all_inventory to load vars for managed_node2 28983 1726883023.51288: Calling groups_inventory to load vars for managed_node2 28983 1726883023.51291: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883023.51302: Calling all_plugins_play to load vars for managed_node2 28983 1726883023.51307: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883023.51311: Calling groups_plugins_play to load vars for managed_node2 28983 1726883023.54004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883023.58623: done with get_vars() 28983 1726883023.58668: done getting variables 28983 1726883023.58749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:43:43 -0400 (0:00:00.195) 0:00:53.585 
****** 28983 1726883023.58793: entering _queue_task() for managed_node2/service 28983 1726883023.59229: worker is 1 (out of 1 available) 28983 1726883023.59247: exiting _queue_task() for managed_node2/service 28983 1726883023.59265: done queuing things up, now waiting for results queue to drain 28983 1726883023.59268: waiting for pending results... 28983 1726883023.59596: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883023.59757: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d24 28983 1726883023.59784: variable 'ansible_search_path' from source: unknown 28983 1726883023.59822: variable 'ansible_search_path' from source: unknown 28983 1726883023.59869: calling self._execute() 28983 1726883023.60022: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883023.60038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883023.60053: variable 'omit' from source: magic vars 28983 1726883023.60971: variable 'ansible_distribution_major_version' from source: facts 28983 1726883023.60976: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883023.61203: variable 'network_provider' from source: set_fact 28983 1726883023.61296: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883023.61299: when evaluation is False, skipping this task 28983 1726883023.61302: _execute() done 28983 1726883023.61304: dumping result to json 28983 1726883023.61306: done dumping result, returning 28983 1726883023.61309: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000000d24] 28983 1726883023.61367: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d24 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
28983 1726883023.61723: no more pending results, returning what we have 28983 1726883023.61727: results queue empty 28983 1726883023.61728: checking for any_errors_fatal 28983 1726883023.61744: done checking for any_errors_fatal 28983 1726883023.61745: checking for max_fail_percentage 28983 1726883023.61747: done checking for max_fail_percentage 28983 1726883023.61749: checking to see if all hosts have failed and the running result is not ok 28983 1726883023.61750: done checking to see if all hosts have failed 28983 1726883023.61751: getting the remaining hosts for this loop 28983 1726883023.61753: done getting the remaining hosts for this loop 28983 1726883023.61758: getting the next task for host managed_node2 28983 1726883023.61768: done getting next task for host managed_node2 28983 1726883023.61773: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883023.61785: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883023.61811: getting variables 28983 1726883023.61813: in VariableManager get_vars() 28983 1726883023.62072: Calling all_inventory to load vars for managed_node2 28983 1726883023.62076: Calling groups_inventory to load vars for managed_node2 28983 1726883023.62080: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883023.62096: Calling all_plugins_play to load vars for managed_node2 28983 1726883023.62101: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883023.62106: Calling groups_plugins_play to load vars for managed_node2 28983 1726883023.62841: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d24 28983 1726883023.62846: WORKER PROCESS EXITING 28983 1726883023.66519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883023.70874: done with get_vars() 28983 1726883023.70919: done getting variables 28983 1726883023.71032: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:43:43 -0400 (0:00:00.122) 0:00:53.708 ****** 28983 1726883023.71085: entering _queue_task() for managed_node2/copy 28983 1726883023.71523: worker is 1 (out of 1 available) 28983 1726883023.71739: exiting _queue_task() for managed_node2/copy 28983 1726883023.71751: done queuing things up, now waiting for results queue to drain 28983 1726883023.71753: waiting for pending 
results... 28983 1726883023.71894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883023.72090: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d25 28983 1726883023.72113: variable 'ansible_search_path' from source: unknown 28983 1726883023.72122: variable 'ansible_search_path' from source: unknown 28983 1726883023.72169: calling self._execute() 28983 1726883023.72312: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883023.72326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883023.72344: variable 'omit' from source: magic vars 28983 1726883023.73296: variable 'ansible_distribution_major_version' from source: facts 28983 1726883023.73367: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883023.73946: variable 'network_provider' from source: set_fact 28983 1726883023.73950: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883023.73952: when evaluation is False, skipping this task 28983 1726883023.73955: _execute() done 28983 1726883023.73958: dumping result to json 28983 1726883023.73960: done dumping result, returning 28983 1726883023.73965: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000000d25] 28983 1726883023.74124: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d25 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883023.74617: no more pending results, returning what we have 28983 1726883023.74621: results queue empty 28983 1726883023.74622: checking for any_errors_fatal 28983 1726883023.74631: done checking for any_errors_fatal 28983 1726883023.74632: checking for max_fail_percentage 
28983 1726883023.74636: done checking for max_fail_percentage 28983 1726883023.74637: checking to see if all hosts have failed and the running result is not ok 28983 1726883023.74638: done checking to see if all hosts have failed 28983 1726883023.74639: getting the remaining hosts for this loop 28983 1726883023.74642: done getting the remaining hosts for this loop 28983 1726883023.74647: getting the next task for host managed_node2 28983 1726883023.74657: done getting next task for host managed_node2 28983 1726883023.74661: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883023.74669: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883023.74693: getting variables 28983 1726883023.74695: in VariableManager get_vars() 28983 1726883023.75090: Calling all_inventory to load vars for managed_node2 28983 1726883023.75094: Calling groups_inventory to load vars for managed_node2 28983 1726883023.75100: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883023.75111: Calling all_plugins_play to load vars for managed_node2 28983 1726883023.75115: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883023.75119: Calling groups_plugins_play to load vars for managed_node2 28983 1726883023.75758: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d25 28983 1726883023.75762: WORKER PROCESS EXITING 28983 1726883023.77594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883023.82723: done with get_vars() 28983 1726883023.82777: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:43:43 -0400 (0:00:00.119) 0:00:53.827 ****** 28983 1726883023.83023: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883023.83960: worker is 1 (out of 1 available) 28983 1726883023.83979: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883023.83994: done queuing things up, now waiting for results queue to drain 28983 1726883023.83996: waiting for pending results... 
28983 1726883023.84552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883023.84980: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d26 28983 1726883023.85129: variable 'ansible_search_path' from source: unknown 28983 1726883023.85141: variable 'ansible_search_path' from source: unknown 28983 1726883023.85322: calling self._execute() 28983 1726883023.85429: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883023.85741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883023.85745: variable 'omit' from source: magic vars 28983 1726883023.86553: variable 'ansible_distribution_major_version' from source: facts 28983 1726883023.86576: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883023.86588: variable 'omit' from source: magic vars 28983 1726883023.86798: variable 'omit' from source: magic vars 28983 1726883023.87236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883023.91407: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883023.91508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883023.91576: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883023.91640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883023.91686: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883023.91801: variable 'network_provider' from source: set_fact 28983 1726883023.92004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883023.92051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883023.92100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883023.92172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883023.92206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883023.92387: variable 'omit' from source: magic vars 28983 1726883023.92470: variable 'omit' from source: magic vars 28983 1726883023.92639: variable 'network_connections' from source: include params 28983 1726883023.92659: variable 'interface' from source: play vars 28983 1726883023.92748: variable 'interface' from source: play vars 28983 1726883023.92942: variable 'omit' from source: magic vars 28983 1726883023.92978: variable '__lsr_ansible_managed' from source: task vars 28983 1726883023.93070: variable '__lsr_ansible_managed' from source: task vars 28983 1726883023.93358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883023.93702: Loaded config def from plugin (lookup/template) 28983 1726883023.93726: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883023.93806: File lookup term: get_ansible_managed.j2 28983 1726883023.93809: variable 
'ansible_search_path' from source: unknown 28983 1726883023.93813: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883023.93818: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883023.93836: variable 'ansible_search_path' from source: unknown 28983 1726883024.09644: variable 'ansible_managed' from source: unknown 28983 1726883024.09708: variable 'omit' from source: magic vars 28983 1726883024.10042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883024.10046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883024.10049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883024.10051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883024.10053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883024.10085: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883024.10088: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883024.10094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883024.10457: Set connection var ansible_connection to ssh 28983 1726883024.10472: Set connection var ansible_shell_executable to /bin/sh 28983 1726883024.10487: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883024.10498: Set connection var ansible_timeout to 10 28983 1726883024.10506: Set connection var ansible_pipelining to False 28983 1726883024.10510: Set connection var ansible_shell_type to sh 28983 1726883024.10538: variable 'ansible_shell_executable' from source: unknown 28983 1726883024.10683: variable 'ansible_connection' from source: unknown 28983 1726883024.10686: variable 'ansible_module_compression' from source: unknown 28983 1726883024.10689: variable 'ansible_shell_type' from source: unknown 28983 1726883024.10736: variable 'ansible_shell_executable' from source: unknown 28983 1726883024.10742: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883024.10744: variable 'ansible_pipelining' from source: unknown 28983 1726883024.10747: variable 'ansible_timeout' from source: unknown 28983 1726883024.10749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883024.11173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883024.11184: variable 'omit' from 
source: magic vars 28983 1726883024.11187: starting attempt loop 28983 1726883024.11190: running the handler 28983 1726883024.11192: _low_level_execute_command(): starting 28983 1726883024.11248: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883024.12706: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883024.12724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883024.12754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883024.12868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883024.13119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883024.13197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883024.14984: stdout chunk (state=3): >>>/root <<< 28983 1726883024.15177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883024.15180: stdout chunk (state=3): >>><<< 28983 1726883024.15183: stderr chunk (state=3): >>><<< 28983 1726883024.15372: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883024.15375: _low_level_execute_command(): starting 28983 1726883024.15379: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350 `" && echo ansible-tmp-1726883024.1529183-30870-260037965517350="` echo /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350 `" ) && sleep 0' 28983 1726883024.16638: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883024.16753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883024.16967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883024.17051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883024.17079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883024.19197: stdout chunk (state=3): >>>ansible-tmp-1726883024.1529183-30870-260037965517350=/root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350 <<< 28983 1726883024.19380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883024.19391: stdout chunk (state=3): >>><<< 28983 1726883024.19402: stderr chunk (state=3): >>><<< 28983 1726883024.19448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883024.1529183-30870-260037965517350=/root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883024.19641: variable 'ansible_module_compression' from source: unknown 28983 1726883024.19645: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883024.19790: variable 'ansible_facts' from source: unknown 28983 1726883024.20138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py 28983 1726883024.20580: Sending initial data 28983 1726883024.20591: Sent initial data (168 bytes) 28983 1726883024.21695: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883024.21699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883024.21702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883024.21776: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883024.21927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883024.21943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883024.22023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883024.23778: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883024.23889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883024.23965: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp03j_ik6y /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py <<< 28983 1726883024.23978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py" <<< 28983 1726883024.24038: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp03j_ik6y" to remote "/root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py" <<< 28983 1726883024.27422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883024.27636: stderr chunk (state=3): >>><<< 28983 1726883024.27641: stdout chunk (state=3): >>><<< 28983 1726883024.27644: done transferring module to remote 28983 1726883024.27646: _low_level_execute_command(): starting 28983 1726883024.27648: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/ /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py && sleep 0' 28983 1726883024.28881: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883024.28885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883024.28966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883024.28969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883024.28972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883024.28975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883024.28977: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883024.29151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883024.29173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883024.29176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883024.29260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883024.31429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883024.31432: stderr chunk (state=3): >>><<< 28983 1726883024.31437: stdout chunk (state=3): >>><<< 28983 1726883024.31440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883024.31442: _low_level_execute_command(): starting 28983 1726883024.31445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/AnsiballZ_network_connections.py && sleep 0' 28983 1726883024.32471: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883024.32735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883024.32741: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883024.32743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883024.32863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883024.61495: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, f251b268-4387-4b61-a766-95deb90f678a skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883024.63389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883024.63626: stderr chunk (state=3): >>><<< 28983 1726883024.63630: stdout chunk (state=3): >>><<< 28983 1726883024.63656: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, f251b268-4387-4b61-a766-95deb90f678a skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883024.63754: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883024.63766: _low_level_execute_command(): starting 28983 1726883024.63774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883024.1529183-30870-260037965517350/ > /dev/null 2>&1 && sleep 0' 28983 1726883024.65307: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883024.65311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883024.65313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883024.65337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883024.65357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883024.65459: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883024.65863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883024.66109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883024.68167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883024.68173: stdout chunk (state=3): >>><<< 28983 1726883024.68176: stderr chunk (state=3): >>><<< 28983 1726883024.68339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883024.68344: handler run complete 28983 1726883024.68347: attempt loop complete, returning result 28983 1726883024.68350: _execute() done 28983 1726883024.68353: dumping result to json 28983 1726883024.68356: done dumping result, returning 28983 1726883024.68359: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-000000000d26] 28983 1726883024.68362: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d26 28983 1726883024.68455: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d26 28983 1726883024.68459: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, f251b268-4387-4b61-a766-95deb90f678a skipped because already active 28983 1726883024.68694: no more pending results, returning what we have 28983 1726883024.68698: results queue empty 28983 1726883024.68699: checking for any_errors_fatal 28983 1726883024.68707: done checking for any_errors_fatal 28983 1726883024.68708: checking for max_fail_percentage 28983 1726883024.68710: done checking for max_fail_percentage 28983 1726883024.68711: checking to see if all hosts have failed and the running result is not ok 28983 1726883024.68716: done checking to see if all hosts have failed 28983 1726883024.68717: getting the remaining hosts for this loop 28983 1726883024.68719: done getting the remaining hosts for this 
loop 28983 1726883024.68724: getting the next task for host managed_node2 28983 1726883024.68733: done getting next task for host managed_node2 28983 1726883024.68942: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883024.68953: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883024.68967: getting variables 28983 1726883024.68969: in VariableManager get_vars() 28983 1726883024.69015: Calling all_inventory to load vars for managed_node2 28983 1726883024.69019: Calling groups_inventory to load vars for managed_node2 28983 1726883024.69022: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883024.69032: Calling all_plugins_play to load vars for managed_node2 28983 1726883024.69039: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883024.69043: Calling groups_plugins_play to load vars for managed_node2 28983 1726883024.76684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883024.85377: done with get_vars() 28983 1726883024.85433: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:43:44 -0400 (0:00:01.027) 0:00:54.855 ****** 28983 1726883024.85754: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883024.86564: worker is 1 (out of 1 available) 28983 1726883024.86582: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883024.86596: done queuing things up, now waiting for results queue to drain 28983 1726883024.86598: waiting for pending results... 
28983 1726883024.87248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883024.87843: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d27 28983 1726883024.87847: variable 'ansible_search_path' from source: unknown 28983 1726883024.87850: variable 'ansible_search_path' from source: unknown 28983 1726883024.87853: calling self._execute() 28983 1726883024.88022: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883024.88169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883024.88175: variable 'omit' from source: magic vars 28983 1726883024.89015: variable 'ansible_distribution_major_version' from source: facts 28983 1726883024.89244: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883024.89425: variable 'network_state' from source: role '' defaults 28983 1726883024.89495: Evaluated conditional (network_state != {}): False 28983 1726883024.89505: when evaluation is False, skipping this task 28983 1726883024.89513: _execute() done 28983 1726883024.89698: dumping result to json 28983 1726883024.89701: done dumping result, returning 28983 1726883024.89704: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000000d27] 28983 1726883024.89707: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d27 28983 1726883024.89789: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d27 28983 1726883024.89793: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883024.89864: no more pending results, returning what we have 28983 1726883024.89869: results queue empty 28983 1726883024.89870: checking for any_errors_fatal 28983 1726883024.89885: done checking for any_errors_fatal 
28983 1726883024.89886: checking for max_fail_percentage 28983 1726883024.89889: done checking for max_fail_percentage 28983 1726883024.89891: checking to see if all hosts have failed and the running result is not ok 28983 1726883024.89892: done checking to see if all hosts have failed 28983 1726883024.89893: getting the remaining hosts for this loop 28983 1726883024.89895: done getting the remaining hosts for this loop 28983 1726883024.89900: getting the next task for host managed_node2 28983 1726883024.89910: done getting next task for host managed_node2 28983 1726883024.89914: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883024.89922: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883024.89950: getting variables 28983 1726883024.89952: in VariableManager get_vars() 28983 1726883024.89999: Calling all_inventory to load vars for managed_node2 28983 1726883024.90003: Calling groups_inventory to load vars for managed_node2 28983 1726883024.90006: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883024.90019: Calling all_plugins_play to load vars for managed_node2 28983 1726883024.90024: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883024.90028: Calling groups_plugins_play to load vars for managed_node2 28983 1726883024.95314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883024.98830: done with get_vars() 28983 1726883024.98880: done getting variables 28983 1726883024.98954: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:43:44 -0400 (0:00:00.132) 0:00:54.987 ****** 28983 1726883024.99000: entering _queue_task() for managed_node2/debug 28983 1726883024.99830: worker is 1 (out of 1 available) 28983 1726883024.99857: exiting _queue_task() for managed_node2/debug 28983 1726883024.99871: done queuing things up, now waiting for results queue to drain 28983 1726883024.99873: waiting for pending results... 
28983 1726883025.00199: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883025.00587: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d28 28983 1726883025.00615: variable 'ansible_search_path' from source: unknown 28983 1726883025.00624: variable 'ansible_search_path' from source: unknown 28983 1726883025.00752: calling self._execute() 28983 1726883025.00818: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.00833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.00855: variable 'omit' from source: magic vars 28983 1726883025.01335: variable 'ansible_distribution_major_version' from source: facts 28983 1726883025.01360: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883025.01369: variable 'omit' from source: magic vars 28983 1726883025.01530: variable 'omit' from source: magic vars 28983 1726883025.01535: variable 'omit' from source: magic vars 28983 1726883025.01572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883025.01631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883025.01652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883025.01673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883025.01698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883025.01739: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883025.01742: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.01750: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883025.01891: Set connection var ansible_connection to ssh 28983 1726883025.01921: Set connection var ansible_shell_executable to /bin/sh 28983 1726883025.01937: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883025.01947: Set connection var ansible_timeout to 10 28983 1726883025.01956: Set connection var ansible_pipelining to False 28983 1726883025.01959: Set connection var ansible_shell_type to sh 28983 1726883025.01992: variable 'ansible_shell_executable' from source: unknown 28983 1726883025.01995: variable 'ansible_connection' from source: unknown 28983 1726883025.01998: variable 'ansible_module_compression' from source: unknown 28983 1726883025.02001: variable 'ansible_shell_type' from source: unknown 28983 1726883025.02006: variable 'ansible_shell_executable' from source: unknown 28983 1726883025.02008: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.02040: variable 'ansible_pipelining' from source: unknown 28983 1726883025.02043: variable 'ansible_timeout' from source: unknown 28983 1726883025.02045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.02222: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883025.02296: variable 'omit' from source: magic vars 28983 1726883025.02299: starting attempt loop 28983 1726883025.02302: running the handler 28983 1726883025.02425: variable '__network_connections_result' from source: set_fact 28983 1726883025.02499: handler run complete 28983 1726883025.02526: attempt loop complete, returning result 28983 1726883025.02529: _execute() done 28983 1726883025.02532: dumping result to json 28983 1726883025.02537: 
done dumping result, returning 28983 1726883025.02547: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000000d28] 28983 1726883025.02552: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d28 28983 1726883025.02692: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d28 28983 1726883025.02695: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, f251b268-4387-4b61-a766-95deb90f678a skipped because already active" ] } 28983 1726883025.02788: no more pending results, returning what we have 28983 1726883025.02792: results queue empty 28983 1726883025.02793: checking for any_errors_fatal 28983 1726883025.02801: done checking for any_errors_fatal 28983 1726883025.02802: checking for max_fail_percentage 28983 1726883025.02804: done checking for max_fail_percentage 28983 1726883025.02805: checking to see if all hosts have failed and the running result is not ok 28983 1726883025.02806: done checking to see if all hosts have failed 28983 1726883025.02807: getting the remaining hosts for this loop 28983 1726883025.02809: done getting the remaining hosts for this loop 28983 1726883025.02814: getting the next task for host managed_node2 28983 1726883025.02822: done getting next task for host managed_node2 28983 1726883025.02826: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883025.02832: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883025.02846: getting variables 28983 1726883025.02848: in VariableManager get_vars() 28983 1726883025.02885: Calling all_inventory to load vars for managed_node2 28983 1726883025.02889: Calling groups_inventory to load vars for managed_node2 28983 1726883025.02892: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.02901: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.02905: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.02908: Calling groups_plugins_play to load vars for managed_node2 28983 1726883025.04273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883025.07925: done with get_vars() 28983 1726883025.07954: done getting variables 28983 1726883025.08006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:43:45 -0400 (0:00:00.090) 0:00:55.078 ****** 28983 1726883025.08060: entering _queue_task() for managed_node2/debug 28983 1726883025.08362: worker is 1 (out of 1 available) 28983 1726883025.08376: exiting _queue_task() for managed_node2/debug 28983 1726883025.08390: done queuing things up, now waiting for results queue to drain 28983 1726883025.08392: waiting for pending results... 28983 1726883025.08630: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883025.08842: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d29 28983 1726883025.08846: variable 'ansible_search_path' from source: unknown 28983 1726883025.08850: variable 'ansible_search_path' from source: unknown 28983 1726883025.08853: calling self._execute() 28983 1726883025.08944: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.08951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.08966: variable 'omit' from source: magic vars 28983 1726883025.09410: variable 'ansible_distribution_major_version' from source: facts 28983 1726883025.09464: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883025.09468: variable 'omit' from source: magic vars 28983 1726883025.09516: variable 'omit' from source: magic vars 28983 1726883025.09560: variable 'omit' from source: magic vars 28983 1726883025.09610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883025.09666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883025.09839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883025.09843: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883025.09845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883025.09848: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883025.09851: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.09853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.09913: Set connection var ansible_connection to ssh 28983 1726883025.09927: Set connection var ansible_shell_executable to /bin/sh 28983 1726883025.09940: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883025.09952: Set connection var ansible_timeout to 10 28983 1726883025.09960: Set connection var ansible_pipelining to False 28983 1726883025.09977: Set connection var ansible_shell_type to sh 28983 1726883025.09993: variable 'ansible_shell_executable' from source: unknown 28983 1726883025.09996: variable 'ansible_connection' from source: unknown 28983 1726883025.10083: variable 'ansible_module_compression' from source: unknown 28983 1726883025.10087: variable 'ansible_shell_type' from source: unknown 28983 1726883025.10090: variable 'ansible_shell_executable' from source: unknown 28983 1726883025.10092: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.10094: variable 'ansible_pipelining' from source: unknown 28983 1726883025.10096: variable 'ansible_timeout' from source: unknown 28983 1726883025.10099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.10190: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883025.10201: variable 'omit' from source: magic vars 28983 1726883025.10298: starting attempt loop 28983 1726883025.10301: running the handler 28983 1726883025.10305: variable '__network_connections_result' from source: set_fact 28983 1726883025.10361: variable '__network_connections_result' from source: set_fact 28983 1726883025.10495: handler run complete 28983 1726883025.10528: attempt loop complete, returning result 28983 1726883025.10532: _execute() done 28983 1726883025.10542: dumping result to json 28983 1726883025.10545: done dumping result, returning 28983 1726883025.10552: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000000d29] 28983 1726883025.10558: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d29 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, f251b268-4387-4b61-a766-95deb90f678a skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, f251b268-4387-4b61-a766-95deb90f678a skipped because already active" ] } } 28983 1726883025.10871: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d29 28983 1726883025.10874: WORKER PROCESS EXITING 28983 1726883025.10890: no more pending results, returning what we have 28983 1726883025.10893: results queue empty 28983 
1726883025.10894: checking for any_errors_fatal 28983 1726883025.10901: done checking for any_errors_fatal 28983 1726883025.10902: checking for max_fail_percentage 28983 1726883025.10904: done checking for max_fail_percentage 28983 1726883025.10905: checking to see if all hosts have failed and the running result is not ok 28983 1726883025.10905: done checking to see if all hosts have failed 28983 1726883025.10906: getting the remaining hosts for this loop 28983 1726883025.10908: done getting the remaining hosts for this loop 28983 1726883025.10911: getting the next task for host managed_node2 28983 1726883025.10918: done getting next task for host managed_node2 28983 1726883025.10922: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883025.10927: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883025.10943: getting variables 28983 1726883025.10944: in VariableManager get_vars() 28983 1726883025.10991: Calling all_inventory to load vars for managed_node2 28983 1726883025.10995: Calling groups_inventory to load vars for managed_node2 28983 1726883025.11002: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.11012: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.11015: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.11020: Calling groups_plugins_play to load vars for managed_node2 28983 1726883025.13629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883025.15727: done with get_vars() 28983 1726883025.15756: done getting variables 28983 1726883025.15806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:43:45 -0400 (0:00:00.077) 0:00:55.156 ****** 28983 1726883025.15837: entering _queue_task() for managed_node2/debug 28983 1726883025.16083: worker is 1 (out of 1 available) 28983 1726883025.16098: exiting _queue_task() for managed_node2/debug 28983 1726883025.16112: done queuing things up, now waiting for results queue to drain 28983 1726883025.16114: waiting for pending results... 
28983 1726883025.16320: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883025.16443: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d2a 28983 1726883025.16459: variable 'ansible_search_path' from source: unknown 28983 1726883025.16462: variable 'ansible_search_path' from source: unknown 28983 1726883025.16500: calling self._execute() 28983 1726883025.16594: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.16600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.16612: variable 'omit' from source: magic vars 28983 1726883025.16965: variable 'ansible_distribution_major_version' from source: facts 28983 1726883025.16985: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883025.17166: variable 'network_state' from source: role '' defaults 28983 1726883025.17217: Evaluated conditional (network_state != {}): False 28983 1726883025.17222: when evaluation is False, skipping this task 28983 1726883025.17226: _execute() done 28983 1726883025.17228: dumping result to json 28983 1726883025.17231: done dumping result, returning 28983 1726883025.17235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000000d2a] 28983 1726883025.17239: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d2a 28983 1726883025.17605: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d2a 28983 1726883025.17609: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883025.17661: no more pending results, returning what we have 28983 1726883025.17664: results queue empty 28983 1726883025.17665: checking for any_errors_fatal 28983 1726883025.17671: done checking for any_errors_fatal 28983 1726883025.17672: checking for 
max_fail_percentage 28983 1726883025.17674: done checking for max_fail_percentage 28983 1726883025.17675: checking to see if all hosts have failed and the running result is not ok 28983 1726883025.17676: done checking to see if all hosts have failed 28983 1726883025.17677: getting the remaining hosts for this loop 28983 1726883025.17678: done getting the remaining hosts for this loop 28983 1726883025.17683: getting the next task for host managed_node2 28983 1726883025.17690: done getting next task for host managed_node2 28983 1726883025.17696: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883025.17702: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883025.17726: getting variables 28983 1726883025.17729: in VariableManager get_vars() 28983 1726883025.17770: Calling all_inventory to load vars for managed_node2 28983 1726883025.17773: Calling groups_inventory to load vars for managed_node2 28983 1726883025.17777: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.17787: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.17790: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.17794: Calling groups_plugins_play to load vars for managed_node2 28983 1726883025.20856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883025.23919: done with get_vars() 28983 1726883025.23963: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:43:45 -0400 (0:00:00.082) 0:00:55.238 ****** 28983 1726883025.24107: entering _queue_task() for managed_node2/ping 28983 1726883025.24404: worker is 1 (out of 1 available) 28983 1726883025.24418: exiting _queue_task() for managed_node2/ping 28983 1726883025.24432: done queuing things up, now waiting for results queue to drain 28983 1726883025.24435: waiting for pending results... 
28983 1726883025.24644: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883025.24764: in run() - task 0affe814-3a2d-b16d-c0a7-000000000d2b 28983 1726883025.24779: variable 'ansible_search_path' from source: unknown 28983 1726883025.24783: variable 'ansible_search_path' from source: unknown 28983 1726883025.24816: calling self._execute() 28983 1726883025.24906: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.24912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.24923: variable 'omit' from source: magic vars 28983 1726883025.25373: variable 'ansible_distribution_major_version' from source: facts 28983 1726883025.25387: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883025.25393: variable 'omit' from source: magic vars 28983 1726883025.25456: variable 'omit' from source: magic vars 28983 1726883025.25490: variable 'omit' from source: magic vars 28983 1726883025.25548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883025.25602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883025.25616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883025.25656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883025.25663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883025.25709: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883025.25713: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.25715: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883025.25840: Set connection var ansible_connection to ssh 28983 1726883025.25851: Set connection var ansible_shell_executable to /bin/sh 28983 1726883025.25859: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883025.25868: Set connection var ansible_timeout to 10 28983 1726883025.25877: Set connection var ansible_pipelining to False 28983 1726883025.25880: Set connection var ansible_shell_type to sh 28983 1726883025.25899: variable 'ansible_shell_executable' from source: unknown 28983 1726883025.25910: variable 'ansible_connection' from source: unknown 28983 1726883025.25916: variable 'ansible_module_compression' from source: unknown 28983 1726883025.25919: variable 'ansible_shell_type' from source: unknown 28983 1726883025.25926: variable 'ansible_shell_executable' from source: unknown 28983 1726883025.25928: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.25930: variable 'ansible_pipelining' from source: unknown 28983 1726883025.25932: variable 'ansible_timeout' from source: unknown 28983 1726883025.25936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.26192: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883025.26197: variable 'omit' from source: magic vars 28983 1726883025.26200: starting attempt loop 28983 1726883025.26205: running the handler 28983 1726883025.26247: _low_level_execute_command(): starting 28983 1726883025.26251: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883025.26918: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883025.26940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883025.26969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883025.26974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.27031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883025.27038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883025.27076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883025.27175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883025.28975: stdout chunk (state=3): >>>/root <<< 28983 1726883025.29100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883025.29146: stderr chunk (state=3): >>><<< 28983 1726883025.29149: stdout chunk (state=3): >>><<< 28983 1726883025.29181: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883025.29196: _low_level_execute_command(): starting 28983 1726883025.29200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568 `" && echo ansible-tmp-1726883025.291761-30925-107164437775568="` echo /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568 `" ) && sleep 0' 28983 1726883025.29718: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883025.29722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.29725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 28983 1726883025.29732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.29794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883025.29797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883025.29861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883025.31909: stdout chunk (state=3): >>>ansible-tmp-1726883025.291761-30925-107164437775568=/root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568 <<< 28983 1726883025.32031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883025.32082: stderr chunk (state=3): >>><<< 28983 1726883025.32085: stdout chunk (state=3): >>><<< 28983 1726883025.32099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883025.291761-30925-107164437775568=/root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883025.32132: variable 'ansible_module_compression' from source: unknown 28983 1726883025.32170: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883025.32195: variable 'ansible_facts' from source: unknown 28983 1726883025.32248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py 28983 1726883025.32358: Sending initial data 28983 1726883025.32361: Sent initial data (152 bytes) 28983 1726883025.32802: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883025.32805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883025.32808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.32811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883025.32813: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.32867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883025.32871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883025.32949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883025.34563: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883025.34573: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883025.34628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883025.34701: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp5ccpwbs0 /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py <<< 28983 1726883025.34709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py" <<< 28983 1726883025.34772: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp5ccpwbs0" to remote "/root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py" <<< 28983 1726883025.35650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883025.35731: stderr chunk (state=3): >>><<< 28983 1726883025.35736: stdout chunk (state=3): >>><<< 28983 1726883025.35758: done transferring module to remote 28983 1726883025.35780: _low_level_execute_command(): starting 28983 1726883025.35783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/ /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py && sleep 0' 28983 1726883025.36366: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883025.36369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883025.36372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass <<< 28983 1726883025.36375: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883025.36378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.36437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883025.36441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883025.36507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883025.38383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883025.38424: stderr chunk (state=3): >>><<< 28983 1726883025.38427: stdout chunk (state=3): >>><<< 28983 1726883025.38444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883025.38447: _low_level_execute_command(): starting 28983 1726883025.38454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/AnsiballZ_ping.py && sleep 0' 28983 1726883025.38930: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883025.39027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883025.39033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.39039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883025.39042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883025.39089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 
1726883025.39155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883025.56080: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883025.57545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883025.57581: stderr chunk (state=3): >>><<< 28983 1726883025.57609: stdout chunk (state=3): >>><<< 28983 1726883025.57620: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883025.57649: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883025.57662: _low_level_execute_command(): starting 28983 1726883025.57668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883025.291761-30925-107164437775568/ > /dev/null 2>&1 && sleep 0' 28983 1726883025.58451: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883025.58454: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 28983 1726883025.58457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883025.58479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883025.58600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883025.60551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883025.60748: stderr chunk (state=3): >>><<< 28983 1726883025.60756: stdout chunk (state=3): >>><<< 28983 1726883025.60932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883025.60944: handler run complete 28983 1726883025.60948: attempt loop complete, returning result 28983 1726883025.60950: _execute() done 28983 1726883025.60953: dumping result to json 28983 
1726883025.60955: done dumping result, returning 28983 1726883025.60957: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000000d2b] 28983 1726883025.60959: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d2b ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883025.61132: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000d2b 28983 1726883025.61138: WORKER PROCESS EXITING 28983 1726883025.61175: no more pending results, returning what we have 28983 1726883025.61180: results queue empty 28983 1726883025.61181: checking for any_errors_fatal 28983 1726883025.61191: done checking for any_errors_fatal 28983 1726883025.61192: checking for max_fail_percentage 28983 1726883025.61195: done checking for max_fail_percentage 28983 1726883025.61196: checking to see if all hosts have failed and the running result is not ok 28983 1726883025.61197: done checking to see if all hosts have failed 28983 1726883025.61202: getting the remaining hosts for this loop 28983 1726883025.61205: done getting the remaining hosts for this loop 28983 1726883025.61209: getting the next task for host managed_node2 28983 1726883025.61227: done getting next task for host managed_node2 28983 1726883025.61230: ^ task is: TASK: meta (role_complete) 28983 1726883025.61237: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883025.61274: getting variables 28983 1726883025.61276: in VariableManager get_vars() 28983 1726883025.61323: Calling all_inventory to load vars for managed_node2 28983 1726883025.61326: Calling groups_inventory to load vars for managed_node2 28983 1726883025.61329: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.61341: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.61344: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.61348: Calling groups_plugins_play to load vars for managed_node2 28983 1726883025.63138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883025.68492: done with get_vars() 28983 1726883025.68745: done getting variables 28983 1726883025.68860: done queuing things up, now waiting for results queue to drain 28983 1726883025.68862: results queue empty 28983 1726883025.68864: checking for any_errors_fatal 28983 1726883025.68868: done checking for any_errors_fatal 28983 1726883025.68869: checking for max_fail_percentage 28983 1726883025.68873: done checking for max_fail_percentage 28983 1726883025.68874: checking to see if all hosts have failed and the running result is not ok 28983 1726883025.68875: done checking to see if all hosts have failed 28983 1726883025.68876: getting the remaining hosts for this 
loop 28983 1726883025.68877: done getting the remaining hosts for this loop 28983 1726883025.68881: getting the next task for host managed_node2 28983 1726883025.68888: done getting next task for host managed_node2 28983 1726883025.68891: ^ task is: TASK: Asserts 28983 1726883025.68894: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883025.68897: getting variables 28983 1726883025.68899: in VariableManager get_vars() 28983 1726883025.68912: Calling all_inventory to load vars for managed_node2 28983 1726883025.68915: Calling groups_inventory to load vars for managed_node2 28983 1726883025.68918: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.68924: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.68927: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.68931: Calling groups_plugins_play to load vars for managed_node2 28983 1726883025.73758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883025.79405: done with get_vars() 28983 1726883025.79468: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:43:45 -0400 (0:00:00.554) 0:00:55.793 ****** 28983 1726883025.79581: entering 
_queue_task() for managed_node2/include_tasks 28983 1726883025.80307: worker is 1 (out of 1 available) 28983 1726883025.80320: exiting _queue_task() for managed_node2/include_tasks 28983 1726883025.80340: done queuing things up, now waiting for results queue to drain 28983 1726883025.80342: waiting for pending results... 28983 1726883025.80854: running TaskExecutor() for managed_node2/TASK: Asserts 28983 1726883025.80859: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a4e 28983 1726883025.80862: variable 'ansible_search_path' from source: unknown 28983 1726883025.80865: variable 'ansible_search_path' from source: unknown 28983 1726883025.80868: variable 'lsr_assert' from source: include params 28983 1726883025.81216: variable 'lsr_assert' from source: include params 28983 1726883025.81329: variable 'omit' from source: magic vars 28983 1726883025.81532: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.81550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.81641: variable 'omit' from source: magic vars 28983 1726883025.82140: variable 'ansible_distribution_major_version' from source: facts 28983 1726883025.82144: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883025.82148: variable 'item' from source: unknown 28983 1726883025.82150: variable 'item' from source: unknown 28983 1726883025.82154: variable 'item' from source: unknown 28983 1726883025.82339: variable 'item' from source: unknown 28983 1726883025.82461: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883025.82465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883025.82469: variable 'omit' from source: magic vars 28983 1726883025.82603: variable 'ansible_distribution_major_version' from source: facts 28983 1726883025.82939: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883025.82943: 
variable 'item' from source: unknown 28983 1726883025.82946: variable 'item' from source: unknown 28983 1726883025.82948: variable 'item' from source: unknown 28983 1726883025.82949: variable 'item' from source: unknown 28983 1726883025.83000: dumping result to json 28983 1726883025.83003: done dumping result, returning 28983 1726883025.83005: done running TaskExecutor() for managed_node2/TASK: Asserts [0affe814-3a2d-b16d-c0a7-000000000a4e] 28983 1726883025.83007: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4e 28983 1726883025.83046: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4e 28983 1726883025.83049: WORKER PROCESS EXITING 28983 1726883025.83088: no more pending results, returning what we have 28983 1726883025.83095: in VariableManager get_vars() 28983 1726883025.83243: Calling all_inventory to load vars for managed_node2 28983 1726883025.83248: Calling groups_inventory to load vars for managed_node2 28983 1726883025.83253: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.83266: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.83273: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.83278: Calling groups_plugins_play to load vars for managed_node2 28983 1726883025.88875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883025.94830: done with get_vars() 28983 1726883025.94877: variable 'ansible_search_path' from source: unknown 28983 1726883025.94878: variable 'ansible_search_path' from source: unknown 28983 1726883025.94929: variable 'ansible_search_path' from source: unknown 28983 1726883025.94930: variable 'ansible_search_path' from source: unknown 28983 1726883025.94982: we have included files to process 28983 1726883025.94983: generating all_blocks data 28983 1726883025.94986: done generating all_blocks data 28983 1726883025.94992: processing included file: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726883025.94993: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726883025.94996: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726883025.95146: in VariableManager get_vars() 28983 1726883025.95177: done with get_vars() 28983 1726883025.95339: done processing included file 28983 1726883025.95342: iterating over new_blocks loaded from include file 28983 1726883025.95343: in VariableManager get_vars() 28983 1726883025.95362: done with get_vars() 28983 1726883025.95364: filtering new block on tags 28983 1726883025.95421: done filtering new block on tags 28983 1726883025.95425: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item=tasks/assert_device_present.yml) 28983 1726883025.95431: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726883025.95432: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726883025.95437: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 28983 1726883025.95589: in VariableManager get_vars() 28983 1726883025.95618: done with get_vars() 28983 1726883025.96142: done processing included file 28983 1726883025.96190: iterating over new_blocks loaded from include file 28983 1726883025.96192: in VariableManager get_vars() 28983 1726883025.96211: done with 
get_vars() 28983 1726883025.96213: filtering new block on tags 28983 1726883025.96355: done filtering new block on tags 28983 1726883025.96358: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml) 28983 1726883025.96422: extending task lists for all hosts with included blocks 28983 1726883025.98778: done extending task lists 28983 1726883025.98780: done processing included files 28983 1726883025.98781: results queue empty 28983 1726883025.98782: checking for any_errors_fatal 28983 1726883025.98784: done checking for any_errors_fatal 28983 1726883025.98785: checking for max_fail_percentage 28983 1726883025.98786: done checking for max_fail_percentage 28983 1726883025.98787: checking to see if all hosts have failed and the running result is not ok 28983 1726883025.98788: done checking to see if all hosts have failed 28983 1726883025.98789: getting the remaining hosts for this loop 28983 1726883025.98791: done getting the remaining hosts for this loop 28983 1726883025.98794: getting the next task for host managed_node2 28983 1726883025.98799: done getting next task for host managed_node2 28983 1726883025.98802: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726883025.98806: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883025.98813: getting variables 28983 1726883025.98815: in VariableManager get_vars() 28983 1726883025.98826: Calling all_inventory to load vars for managed_node2 28983 1726883025.98829: Calling groups_inventory to load vars for managed_node2 28983 1726883025.98832: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883025.98840: Calling all_plugins_play to load vars for managed_node2 28983 1726883025.98843: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883025.98847: Calling groups_plugins_play to load vars for managed_node2 28983 1726883026.03473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883026.08511: done with get_vars() 28983 1726883026.08611: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:43:46 -0400 (0:00:00.292) 0:00:56.086 ****** 28983 1726883026.08918: entering _queue_task() for managed_node2/include_tasks 28983 1726883026.09764: worker is 1 (out of 1 available) 28983 1726883026.09777: exiting _queue_task() for managed_node2/include_tasks 28983 1726883026.09850: done queuing things up, now waiting for results queue to drain 28983 1726883026.09853: waiting for pending results... 
28983 1726883026.10120: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726883026.10268: in run() - task 0affe814-3a2d-b16d-c0a7-000000000e86 28983 1726883026.10289: variable 'ansible_search_path' from source: unknown 28983 1726883026.10293: variable 'ansible_search_path' from source: unknown 28983 1726883026.10339: calling self._execute() 28983 1726883026.10465: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.10476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.10488: variable 'omit' from source: magic vars 28983 1726883026.10997: variable 'ansible_distribution_major_version' from source: facts 28983 1726883026.11017: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883026.11023: _execute() done 28983 1726883026.11026: dumping result to json 28983 1726883026.11032: done dumping result, returning 28983 1726883026.11044: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-000000000e86] 28983 1726883026.11051: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e86 28983 1726883026.11239: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e86 28983 1726883026.11244: WORKER PROCESS EXITING 28983 1726883026.11287: no more pending results, returning what we have 28983 1726883026.11292: in VariableManager get_vars() 28983 1726883026.11340: Calling all_inventory to load vars for managed_node2 28983 1726883026.11348: Calling groups_inventory to load vars for managed_node2 28983 1726883026.11353: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883026.11367: Calling all_plugins_play to load vars for managed_node2 28983 1726883026.11372: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883026.11376: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883026.16472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883026.20540: done with get_vars() 28983 1726883026.20578: variable 'ansible_search_path' from source: unknown 28983 1726883026.20580: variable 'ansible_search_path' from source: unknown 28983 1726883026.20594: variable 'item' from source: include params 28983 1726883026.20731: variable 'item' from source: include params 28983 1726883026.20943: we have included files to process 28983 1726883026.20945: generating all_blocks data 28983 1726883026.20947: done generating all_blocks data 28983 1726883026.20948: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883026.20950: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883026.20953: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883026.21299: done processing included file 28983 1726883026.21302: iterating over new_blocks loaded from include file 28983 1726883026.21304: in VariableManager get_vars() 28983 1726883026.21326: done with get_vars() 28983 1726883026.21328: filtering new block on tags 28983 1726883026.21377: done filtering new block on tags 28983 1726883026.21380: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726883026.21387: extending task lists for all hosts with included blocks 28983 1726883026.21629: done extending task lists 28983 1726883026.21631: done processing included files 28983 1726883026.21632: results queue empty 28983 1726883026.21633: checking for any_errors_fatal 28983 1726883026.21640: done 
checking for any_errors_fatal 28983 1726883026.21641: checking for max_fail_percentage 28983 1726883026.21642: done checking for max_fail_percentage 28983 1726883026.21643: checking to see if all hosts have failed and the running result is not ok 28983 1726883026.21644: done checking to see if all hosts have failed 28983 1726883026.21645: getting the remaining hosts for this loop 28983 1726883026.21647: done getting the remaining hosts for this loop 28983 1726883026.21650: getting the next task for host managed_node2 28983 1726883026.21656: done getting next task for host managed_node2 28983 1726883026.21659: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726883026.21663: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883026.21666: getting variables 28983 1726883026.21667: in VariableManager get_vars() 28983 1726883026.21680: Calling all_inventory to load vars for managed_node2 28983 1726883026.21683: Calling groups_inventory to load vars for managed_node2 28983 1726883026.21686: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883026.21700: Calling all_plugins_play to load vars for managed_node2 28983 1726883026.21704: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883026.21708: Calling groups_plugins_play to load vars for managed_node2 28983 1726883026.30323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883026.32476: done with get_vars() 28983 1726883026.32500: done getting variables 28983 1726883026.32606: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:43:46 -0400 (0:00:00.237) 0:00:56.324 ****** 28983 1726883026.32628: entering _queue_task() for managed_node2/stat 28983 1726883026.32904: worker is 1 (out of 1 available) 28983 1726883026.32919: exiting _queue_task() for managed_node2/stat 28983 1726883026.32931: done queuing things up, now waiting for results queue to drain 28983 1726883026.32939: waiting for pending results... 
28983 1726883026.33131: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726883026.33249: in run() - task 0affe814-3a2d-b16d-c0a7-000000000ef5 28983 1726883026.33264: variable 'ansible_search_path' from source: unknown 28983 1726883026.33269: variable 'ansible_search_path' from source: unknown 28983 1726883026.33308: calling self._execute() 28983 1726883026.33401: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.33408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.33418: variable 'omit' from source: magic vars 28983 1726883026.33761: variable 'ansible_distribution_major_version' from source: facts 28983 1726883026.33771: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883026.33780: variable 'omit' from source: magic vars 28983 1726883026.33832: variable 'omit' from source: magic vars 28983 1726883026.33921: variable 'interface' from source: play vars 28983 1726883026.33942: variable 'omit' from source: magic vars 28983 1726883026.33984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883026.34015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883026.34033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883026.34056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883026.34067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883026.34100: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883026.34103: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.34108: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.34194: Set connection var ansible_connection to ssh 28983 1726883026.34204: Set connection var ansible_shell_executable to /bin/sh 28983 1726883026.34213: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883026.34222: Set connection var ansible_timeout to 10 28983 1726883026.34228: Set connection var ansible_pipelining to False 28983 1726883026.34231: Set connection var ansible_shell_type to sh 28983 1726883026.34254: variable 'ansible_shell_executable' from source: unknown 28983 1726883026.34259: variable 'ansible_connection' from source: unknown 28983 1726883026.34262: variable 'ansible_module_compression' from source: unknown 28983 1726883026.34264: variable 'ansible_shell_type' from source: unknown 28983 1726883026.34266: variable 'ansible_shell_executable' from source: unknown 28983 1726883026.34276: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.34279: variable 'ansible_pipelining' from source: unknown 28983 1726883026.34287: variable 'ansible_timeout' from source: unknown 28983 1726883026.34290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.34459: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883026.34469: variable 'omit' from source: magic vars 28983 1726883026.34478: starting attempt loop 28983 1726883026.34482: running the handler 28983 1726883026.34504: _low_level_execute_command(): starting 28983 1726883026.34507: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883026.35163: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.35231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883026.35253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883026.35277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883026.35385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883026.37161: stdout chunk (state=3): >>>/root <<< 28983 1726883026.37272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883026.37323: stderr chunk (state=3): >>><<< 28983 1726883026.37327: stdout chunk (state=3): >>><<< 28983 1726883026.37350: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883026.37361: _low_level_execute_command(): starting 28983 1726883026.37369: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419 `" && echo ansible-tmp-1726883026.3734968-30964-103159213989419="` echo /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419 `" ) && sleep 0' 28983 1726883026.37815: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883026.37819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883026.37822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883026.37833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883026.37843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.37882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883026.37885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883026.37966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883026.39985: stdout chunk (state=3): >>>ansible-tmp-1726883026.3734968-30964-103159213989419=/root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419 <<< 28983 1726883026.40161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883026.40165: stdout chunk (state=3): >>><<< 28983 1726883026.40167: stderr chunk (state=3): >>><<< 28983 1726883026.40340: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883026.3734968-30964-103159213989419=/root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883026.40347: variable 'ansible_module_compression' from source: unknown 28983 1726883026.40351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883026.40375: variable 'ansible_facts' from source: unknown 28983 1726883026.40498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py 28983 1726883026.40713: Sending initial data 28983 1726883026.40723: Sent initial data (153 bytes) 28983 1726883026.41415: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883026.41462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883026.41502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883026.41556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.41649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883026.41690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883026.41728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883026.41827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883026.43478: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883026.43626: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883026.43684: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpmnovoo8y /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py <<< 28983 1726883026.43688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py" <<< 28983 1726883026.43797: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpmnovoo8y" to remote "/root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py" <<< 28983 1726883026.44933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883026.44966: stderr chunk (state=3): >>><<< 28983 1726883026.44969: stdout chunk (state=3): >>><<< 28983 1726883026.44991: done transferring module to remote 28983 1726883026.45001: _low_level_execute_command(): starting 28983 1726883026.45006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/ /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py && sleep 0' 28983 1726883026.45443: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883026.45448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.45450: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883026.45453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883026.45455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.45508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883026.45512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883026.45588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883026.47637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883026.47641: stdout chunk (state=3): >>><<< 28983 1726883026.47644: stderr chunk (state=3): >>><<< 28983 1726883026.47647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883026.47649: _low_level_execute_command(): starting 28983 1726883026.47656: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/AnsiballZ_stat.py && sleep 0' 28983 1726883026.48108: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883026.48115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883026.48122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883026.48142: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.48152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883026.48160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883026.48173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883026.48240: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 28983 1726883026.48258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883026.48327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883026.65673: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38149, "dev": 23, "nlink": 1, "atime": 1726883015.162544, "mtime": 1726883015.162544, "ctime": 1726883015.162544, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883026.67142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883026.67146: stdout chunk (state=3): >>><<< 28983 1726883026.67148: stderr chunk (state=3): >>><<< 28983 1726883026.67154: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38149, "dev": 23, "nlink": 1, "atime": 1726883015.162544, "mtime": 1726883015.162544, "ctime": 1726883015.162544, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883026.67157: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883026.67159: _low_level_execute_command(): starting 28983 1726883026.67431: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883026.3734968-30964-103159213989419/ > /dev/null 2>&1 && sleep 0' 28983 1726883026.68395: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883026.68640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883026.68738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883026.68742: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883026.68755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883026.68803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883026.68949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883026.71151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883026.71155: stderr chunk (state=3): >>><<< 28983 1726883026.71161: stdout chunk (state=3): >>><<< 28983 1726883026.71189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883026.71316: handler run complete 28983 1726883026.71394: attempt loop complete, returning result 28983 1726883026.71398: _execute() done 28983 1726883026.71401: dumping result to json 28983 1726883026.71411: done dumping result, returning 28983 1726883026.71539: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-000000000ef5] 28983 1726883026.71553: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000ef5 28983 1726883026.71703: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000ef5 28983 1726883026.71708: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726883015.162544, "block_size": 4096, "blocks": 0, "ctime": 1726883015.162544, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38149, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726883015.162544, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 28983 1726883026.71847: no more pending results, returning what we have 28983 1726883026.71852: results queue empty 28983 1726883026.71853: checking for any_errors_fatal 28983 1726883026.71855: done checking for any_errors_fatal 28983 
1726883026.71856: checking for max_fail_percentage 28983 1726883026.71859: done checking for max_fail_percentage 28983 1726883026.71859: checking to see if all hosts have failed and the running result is not ok 28983 1726883026.71861: done checking to see if all hosts have failed 28983 1726883026.71862: getting the remaining hosts for this loop 28983 1726883026.71864: done getting the remaining hosts for this loop 28983 1726883026.71870: getting the next task for host managed_node2 28983 1726883026.71884: done getting next task for host managed_node2 28983 1726883026.71889: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28983 1726883026.71893: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883026.71903: getting variables 28983 1726883026.71904: in VariableManager get_vars() 28983 1726883026.72374: Calling all_inventory to load vars for managed_node2 28983 1726883026.72378: Calling groups_inventory to load vars for managed_node2 28983 1726883026.72383: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883026.72395: Calling all_plugins_play to load vars for managed_node2 28983 1726883026.72399: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883026.72403: Calling groups_plugins_play to load vars for managed_node2 28983 1726883026.77725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883026.81829: done with get_vars() 28983 1726883026.81867: done getting variables 28983 1726883026.81953: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883026.82316: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:43:46 -0400 (0:00:00.497) 0:00:56.822 ****** 28983 1726883026.82450: entering _queue_task() for managed_node2/assert 28983 1726883026.83560: worker is 1 (out of 1 available) 28983 1726883026.83573: exiting _queue_task() for managed_node2/assert 28983 1726883026.83584: done queuing things up, now waiting for results queue to drain 28983 1726883026.83586: waiting for pending results... 
28983 1726883026.84123: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 28983 1726883026.84478: in run() - task 0affe814-3a2d-b16d-c0a7-000000000e87 28983 1726883026.84483: variable 'ansible_search_path' from source: unknown 28983 1726883026.84486: variable 'ansible_search_path' from source: unknown 28983 1726883026.84533: calling self._execute() 28983 1726883026.84815: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.84826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.84881: variable 'omit' from source: magic vars 28983 1726883026.85510: variable 'ansible_distribution_major_version' from source: facts 28983 1726883026.85530: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883026.85537: variable 'omit' from source: magic vars 28983 1726883026.85612: variable 'omit' from source: magic vars 28983 1726883026.85746: variable 'interface' from source: play vars 28983 1726883026.85767: variable 'omit' from source: magic vars 28983 1726883026.85827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883026.85907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883026.86016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883026.86020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883026.86022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883026.86024: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883026.86027: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.86029: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.86157: Set connection var ansible_connection to ssh 28983 1726883026.86174: Set connection var ansible_shell_executable to /bin/sh 28983 1726883026.86188: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883026.86199: Set connection var ansible_timeout to 10 28983 1726883026.86206: Set connection var ansible_pipelining to False 28983 1726883026.86210: Set connection var ansible_shell_type to sh 28983 1726883026.86241: variable 'ansible_shell_executable' from source: unknown 28983 1726883026.86245: variable 'ansible_connection' from source: unknown 28983 1726883026.86248: variable 'ansible_module_compression' from source: unknown 28983 1726883026.86255: variable 'ansible_shell_type' from source: unknown 28983 1726883026.86263: variable 'ansible_shell_executable' from source: unknown 28983 1726883026.86442: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.86446: variable 'ansible_pipelining' from source: unknown 28983 1726883026.86449: variable 'ansible_timeout' from source: unknown 28983 1726883026.86453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.86478: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883026.86500: variable 'omit' from source: magic vars 28983 1726883026.86513: starting attempt loop 28983 1726883026.86517: running the handler 28983 1726883026.86943: variable 'interface_stat' from source: set_fact 28983 1726883026.87002: Evaluated conditional (interface_stat.stat.exists): True 28983 1726883026.87009: handler run complete 28983 1726883026.87029: attempt loop complete, returning result 28983 
1726883026.87241: _execute() done 28983 1726883026.87245: dumping result to json 28983 1726883026.87247: done dumping result, returning 28983 1726883026.87249: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [0affe814-3a2d-b16d-c0a7-000000000e87] 28983 1726883026.87252: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e87 28983 1726883026.87329: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e87 28983 1726883026.87332: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883026.87399: no more pending results, returning what we have 28983 1726883026.87403: results queue empty 28983 1726883026.87404: checking for any_errors_fatal 28983 1726883026.87418: done checking for any_errors_fatal 28983 1726883026.87419: checking for max_fail_percentage 28983 1726883026.87422: done checking for max_fail_percentage 28983 1726883026.87423: checking to see if all hosts have failed and the running result is not ok 28983 1726883026.87424: done checking to see if all hosts have failed 28983 1726883026.87425: getting the remaining hosts for this loop 28983 1726883026.87428: done getting the remaining hosts for this loop 28983 1726883026.87433: getting the next task for host managed_node2 28983 1726883026.87569: done getting next task for host managed_node2 28983 1726883026.87573: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28983 1726883026.87579: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883026.87585: getting variables 28983 1726883026.87587: in VariableManager get_vars() 28983 1726883026.87630: Calling all_inventory to load vars for managed_node2 28983 1726883026.87840: Calling groups_inventory to load vars for managed_node2 28983 1726883026.87846: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883026.87858: Calling all_plugins_play to load vars for managed_node2 28983 1726883026.87863: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883026.87867: Calling groups_plugins_play to load vars for managed_node2 28983 1726883026.90900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883026.94236: done with get_vars() 28983 1726883026.94283: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:43:46 -0400 (0:00:00.119) 0:00:56.941 ****** 28983 1726883026.94407: entering _queue_task() for managed_node2/include_tasks 28983 1726883026.94927: worker is 1 (out of 1 available) 28983 1726883026.94941: exiting _queue_task() for managed_node2/include_tasks 28983 1726883026.94952: done queuing things up, now waiting for results queue to drain 28983 1726883026.94954: waiting for pending results... 
28983 1726883026.95326: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28983 1726883026.95331: in run() - task 0affe814-3a2d-b16d-c0a7-000000000e8b 28983 1726883026.95358: variable 'ansible_search_path' from source: unknown 28983 1726883026.95363: variable 'ansible_search_path' from source: unknown 28983 1726883026.95366: calling self._execute() 28983 1726883026.95831: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883026.95837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883026.95840: variable 'omit' from source: magic vars 28983 1726883026.96200: variable 'ansible_distribution_major_version' from source: facts 28983 1726883026.96268: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883026.96274: _execute() done 28983 1726883026.96277: dumping result to json 28983 1726883026.96279: done dumping result, returning 28983 1726883026.96281: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-b16d-c0a7-000000000e8b] 28983 1726883026.96284: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8b 28983 1726883026.96485: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8b 28983 1726883026.96489: WORKER PROCESS EXITING 28983 1726883026.96524: no more pending results, returning what we have 28983 1726883026.96530: in VariableManager get_vars() 28983 1726883026.96577: Calling all_inventory to load vars for managed_node2 28983 1726883026.96638: Calling groups_inventory to load vars for managed_node2 28983 1726883026.96644: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883026.96658: Calling all_plugins_play to load vars for managed_node2 28983 1726883026.96662: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883026.96666: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883027.01771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883027.05999: done with get_vars() 28983 1726883027.06039: variable 'ansible_search_path' from source: unknown 28983 1726883027.06040: variable 'ansible_search_path' from source: unknown 28983 1726883027.06168: variable 'item' from source: include params 28983 1726883027.06300: variable 'item' from source: include params 28983 1726883027.06554: we have included files to process 28983 1726883027.06555: generating all_blocks data 28983 1726883027.06558: done generating all_blocks data 28983 1726883027.06566: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883027.06567: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883027.06571: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883027.08870: done processing included file 28983 1726883027.08872: iterating over new_blocks loaded from include file 28983 1726883027.08874: in VariableManager get_vars() 28983 1726883027.08907: done with get_vars() 28983 1726883027.08910: filtering new block on tags 28983 1726883027.09023: done filtering new block on tags 28983 1726883027.09027: in VariableManager get_vars() 28983 1726883027.09049: done with get_vars() 28983 1726883027.09052: filtering new block on tags 28983 1726883027.09145: done filtering new block on tags 28983 1726883027.09148: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28983 1726883027.09154: extending task lists for all hosts with included blocks 28983 1726883027.09689: done 
extending task lists 28983 1726883027.09690: done processing included files 28983 1726883027.09692: results queue empty 28983 1726883027.09693: checking for any_errors_fatal 28983 1726883027.09697: done checking for any_errors_fatal 28983 1726883027.09698: checking for max_fail_percentage 28983 1726883027.09699: done checking for max_fail_percentage 28983 1726883027.09700: checking to see if all hosts have failed and the running result is not ok 28983 1726883027.09701: done checking to see if all hosts have failed 28983 1726883027.09702: getting the remaining hosts for this loop 28983 1726883027.09704: done getting the remaining hosts for this loop 28983 1726883027.09707: getting the next task for host managed_node2 28983 1726883027.09713: done getting next task for host managed_node2 28983 1726883027.09716: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883027.09720: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726883027.09723: getting variables 28983 1726883027.09724: in VariableManager get_vars() 28983 1726883027.09737: Calling all_inventory to load vars for managed_node2 28983 1726883027.09740: Calling groups_inventory to load vars for managed_node2 28983 1726883027.09744: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883027.09750: Calling all_plugins_play to load vars for managed_node2 28983 1726883027.09754: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883027.09757: Calling groups_plugins_play to load vars for managed_node2 28983 1726883027.12963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883027.17521: done with get_vars() 28983 1726883027.17706: done getting variables 28983 1726883027.17762: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:43:47 -0400 (0:00:00.234) 0:00:57.176 ****** 28983 1726883027.17850: entering _queue_task() for managed_node2/set_fact 28983 1726883027.18389: worker is 1 (out of 1 available) 28983 1726883027.18401: exiting _queue_task() for managed_node2/set_fact 28983 1726883027.18413: done queuing things up, now waiting for results queue to drain 28983 1726883027.18415: waiting for pending results... 
28983 1726883027.18940: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883027.19440: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f13 28983 1726883027.19444: variable 'ansible_search_path' from source: unknown 28983 1726883027.19448: variable 'ansible_search_path' from source: unknown 28983 1726883027.19451: calling self._execute() 28983 1726883027.19663: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.19682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.19702: variable 'omit' from source: magic vars 28983 1726883027.20546: variable 'ansible_distribution_major_version' from source: facts 28983 1726883027.20839: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883027.20843: variable 'omit' from source: magic vars 28983 1726883027.20846: variable 'omit' from source: magic vars 28983 1726883027.20892: variable 'omit' from source: magic vars 28983 1726883027.21242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883027.21246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883027.21249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883027.21256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883027.21277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883027.21323: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883027.21336: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.21348: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883027.21620: Set connection var ansible_connection to ssh 28983 1726883027.21644: Set connection var ansible_shell_executable to /bin/sh 28983 1726883027.21697: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883027.21714: Set connection var ansible_timeout to 10 28983 1726883027.21898: Set connection var ansible_pipelining to False 28983 1726883027.21902: Set connection var ansible_shell_type to sh 28983 1726883027.21905: variable 'ansible_shell_executable' from source: unknown 28983 1726883027.21907: variable 'ansible_connection' from source: unknown 28983 1726883027.21910: variable 'ansible_module_compression' from source: unknown 28983 1726883027.21912: variable 'ansible_shell_type' from source: unknown 28983 1726883027.21914: variable 'ansible_shell_executable' from source: unknown 28983 1726883027.21916: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.21918: variable 'ansible_pipelining' from source: unknown 28983 1726883027.21920: variable 'ansible_timeout' from source: unknown 28983 1726883027.21923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.22265: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883027.22355: variable 'omit' from source: magic vars 28983 1726883027.22367: starting attempt loop 28983 1726883027.22379: running the handler 28983 1726883027.22400: handler run complete 28983 1726883027.22455: attempt loop complete, returning result 28983 1726883027.22462: _execute() done 28983 1726883027.22470: dumping result to json 28983 1726883027.22482: done dumping result, returning 28983 1726883027.22639: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-b16d-c0a7-000000000f13] 28983 1726883027.22643: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f13 28983 1726883027.22837: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f13 28983 1726883027.22841: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28983 1726883027.22902: no more pending results, returning what we have 28983 1726883027.22906: results queue empty 28983 1726883027.22907: checking for any_errors_fatal 28983 1726883027.22909: done checking for any_errors_fatal 28983 1726883027.22910: checking for max_fail_percentage 28983 1726883027.22912: done checking for max_fail_percentage 28983 1726883027.22913: checking to see if all hosts have failed and the running result is not ok 28983 1726883027.22914: done checking to see if all hosts have failed 28983 1726883027.22915: getting the remaining hosts for this loop 28983 1726883027.22917: done getting the remaining hosts for this loop 28983 1726883027.22921: getting the next task for host managed_node2 28983 1726883027.22932: done getting next task for host managed_node2 28983 1726883027.22936: ^ task is: TASK: Stat profile file 28983 1726883027.22943: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883027.22947: getting variables 28983 1726883027.22949: in VariableManager get_vars() 28983 1726883027.22986: Calling all_inventory to load vars for managed_node2 28983 1726883027.22990: Calling groups_inventory to load vars for managed_node2 28983 1726883027.22994: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883027.23006: Calling all_plugins_play to load vars for managed_node2 28983 1726883027.23010: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883027.23014: Calling groups_plugins_play to load vars for managed_node2 28983 1726883027.25889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883027.32368: done with get_vars() 28983 1726883027.32415: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:43:47 -0400 (0:00:00.146) 0:00:57.323 ****** 28983 1726883027.32675: entering _queue_task() for managed_node2/stat 28983 1726883027.33601: worker is 1 (out of 1 available) 28983 1726883027.33613: exiting _queue_task() for managed_node2/stat 28983 1726883027.33624: done queuing things up, now waiting for results queue to drain 28983 1726883027.33626: 
waiting for pending results... 28983 1726883027.33957: running TaskExecutor() for managed_node2/TASK: Stat profile file 28983 1726883027.34342: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f14 28983 1726883027.34354: variable 'ansible_search_path' from source: unknown 28983 1726883027.34359: variable 'ansible_search_path' from source: unknown 28983 1726883027.34363: calling self._execute() 28983 1726883027.34685: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.34845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.34849: variable 'omit' from source: magic vars 28983 1726883027.35901: variable 'ansible_distribution_major_version' from source: facts 28983 1726883027.35920: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883027.35936: variable 'omit' from source: magic vars 28983 1726883027.36039: variable 'omit' from source: magic vars 28983 1726883027.36195: variable 'profile' from source: play vars 28983 1726883027.36211: variable 'interface' from source: play vars 28983 1726883027.36313: variable 'interface' from source: play vars 28983 1726883027.36343: variable 'omit' from source: magic vars 28983 1726883027.36409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883027.36490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883027.36516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883027.36553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883027.36598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883027.36625: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883027.36644: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.36656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.36817: Set connection var ansible_connection to ssh 28983 1726883027.36838: Set connection var ansible_shell_executable to /bin/sh 28983 1726883027.36926: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883027.36935: Set connection var ansible_timeout to 10 28983 1726883027.36939: Set connection var ansible_pipelining to False 28983 1726883027.36945: Set connection var ansible_shell_type to sh 28983 1726883027.36949: variable 'ansible_shell_executable' from source: unknown 28983 1726883027.36957: variable 'ansible_connection' from source: unknown 28983 1726883027.36970: variable 'ansible_module_compression' from source: unknown 28983 1726883027.36982: variable 'ansible_shell_type' from source: unknown 28983 1726883027.36989: variable 'ansible_shell_executable' from source: unknown 28983 1726883027.36997: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.37006: variable 'ansible_pipelining' from source: unknown 28983 1726883027.37014: variable 'ansible_timeout' from source: unknown 28983 1726883027.37034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.37318: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883027.37360: variable 'omit' from source: magic vars 28983 1726883027.37364: starting attempt loop 28983 1726883027.37374: running the handler 28983 1726883027.37405: _low_level_execute_command(): starting 28983 1726883027.37408: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 
1726883027.38420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.38594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.38654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.38727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.40489: stdout chunk (state=3): >>>/root <<< 28983 1726883027.40684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883027.40688: stdout chunk (state=3): >>><<< 28983 1726883027.40690: stderr chunk (state=3): >>><<< 28983 1726883027.40714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883027.40824: _low_level_execute_command(): starting 28983 1726883027.40828: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647 `" && echo ansible-tmp-1726883027.40722-31003-253691847352647="` echo /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647 `" ) && sleep 0' 28983 1726883027.41377: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883027.41398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883027.41412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883027.41433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883027.41453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883027.41509: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.41591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883027.41623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.41657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.41740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.43823: stdout chunk (state=3): >>>ansible-tmp-1726883027.40722-31003-253691847352647=/root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647 <<< 28983 1726883027.44023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883027.44051: stderr chunk (state=3): >>><<< 28983 1726883027.44054: stdout chunk (state=3): >>><<< 28983 1726883027.44349: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883027.40722-31003-253691847352647=/root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883027.44353: variable 'ansible_module_compression' from source: unknown 28983 1726883027.44356: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883027.44358: variable 'ansible_facts' from source: unknown 28983 1726883027.44639: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py 28983 1726883027.45190: Sending initial data 28983 1726883027.45224: Sent initial data (151 bytes) 28983 1726883027.46210: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883027.46224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.46256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.46420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883027.46433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.46538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.46579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.48237: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726883027.48260: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883027.48349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883027.48446: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpct_sgz11 /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py <<< 28983 1726883027.48462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py" <<< 28983 1726883027.48514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpct_sgz11" to remote "/root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py" <<< 28983 1726883027.49865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883027.49945: stderr chunk (state=3): >>><<< 28983 1726883027.49949: stdout chunk (state=3): >>><<< 28983 1726883027.49976: done transferring module to remote 28983 1726883027.50055: _low_level_execute_command(): starting 28983 1726883027.50064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/ /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py && sleep 0' 28983 1726883027.51111: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883027.51157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883027.51167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883027.51243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.51264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.51383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.53626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883027.53630: stdout chunk (state=3): >>><<< 28983 1726883027.53638: stderr chunk (state=3): >>><<< 28983 1726883027.53641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883027.53644: _low_level_execute_command(): starting 28983 1726883027.53646: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/AnsiballZ_stat.py && sleep 0' 28983 1726883027.54699: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883027.54715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883027.54732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883027.54847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.55269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.55344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.73240: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883027.74810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883027.74822: stdout chunk (state=3): >>><<< 28983 1726883027.74839: stderr chunk (state=3): >>><<< 28983 1726883027.74875: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.46.139 closed. 28983 1726883027.74924: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883027.74947: _low_level_execute_command(): starting 28983 1726883027.74960: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883027.40722-31003-253691847352647/ > /dev/null 2>&1 && sleep 0' 28983 1726883027.75565: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883027.75584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883027.75600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883027.75620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883027.75642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883027.75657: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883027.75677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.75755: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.75797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883027.75814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.75841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.75945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.78088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883027.78113: stderr chunk (state=3): >>><<< 28983 1726883027.78148: stdout chunk (state=3): >>><<< 28983 1726883027.78300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883027.78304: handler run complete 28983 1726883027.78307: attempt loop complete, returning result 28983 1726883027.78309: _execute() done 28983 1726883027.78311: dumping result to json 28983 1726883027.78638: done dumping result, returning 28983 1726883027.78642: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affe814-3a2d-b16d-c0a7-000000000f14] 28983 1726883027.78645: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f14 28983 1726883027.78840: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f14 28983 1726883027.78846: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883027.78929: no more pending results, returning what we have 28983 1726883027.78933: results queue empty 28983 1726883027.78936: checking for any_errors_fatal 28983 1726883027.78944: done checking for any_errors_fatal 28983 1726883027.78945: checking for max_fail_percentage 28983 1726883027.78948: done checking for max_fail_percentage 28983 1726883027.78949: checking to see if all hosts have failed and the running result is not ok 28983 1726883027.78950: done checking to see if all hosts have failed 28983 1726883027.78951: getting the remaining hosts for this loop 28983 1726883027.78954: done getting the remaining hosts for this loop 28983 1726883027.78961: getting the next task for host managed_node2 28983 1726883027.78973: done getting next task for host managed_node2 28983 1726883027.78977: ^ task is: TASK: Set NM profile exist flag based on the profile files 28983 1726883027.78984: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883027.78988: getting variables 28983 1726883027.78990: in VariableManager get_vars() 28983 1726883027.79032: Calling all_inventory to load vars for managed_node2 28983 1726883027.79240: Calling groups_inventory to load vars for managed_node2 28983 1726883027.79245: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883027.79267: Calling all_plugins_play to load vars for managed_node2 28983 1726883027.79274: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883027.79278: Calling groups_plugins_play to load vars for managed_node2 28983 1726883027.82049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883027.84607: done with get_vars() 28983 1726883027.84638: done getting variables 28983 1726883027.84732: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:43:47 -0400 (0:00:00.522) 0:00:57.845 ****** 28983 1726883027.84789: entering _queue_task() for managed_node2/set_fact 28983 1726883027.85156: worker is 1 (out of 1 available) 28983 1726883027.85174: exiting _queue_task() for managed_node2/set_fact 28983 1726883027.85186: done queuing things up, now waiting for results queue to drain 28983 1726883027.85188: waiting for pending results... 
28983 1726883027.85523: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28983 1726883027.85717: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f15 28983 1726883027.85769: variable 'ansible_search_path' from source: unknown 28983 1726883027.85777: variable 'ansible_search_path' from source: unknown 28983 1726883027.85811: calling self._execute() 28983 1726883027.85986: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.85990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.85993: variable 'omit' from source: magic vars 28983 1726883027.86490: variable 'ansible_distribution_major_version' from source: facts 28983 1726883027.86508: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883027.86709: variable 'profile_stat' from source: set_fact 28983 1726883027.86751: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883027.86758: when evaluation is False, skipping this task 28983 1726883027.86762: _execute() done 28983 1726883027.86838: dumping result to json 28983 1726883027.86842: done dumping result, returning 28983 1726883027.86845: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-b16d-c0a7-000000000f15] 28983 1726883027.86847: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f15 28983 1726883027.86931: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f15 28983 1726883027.86936: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883027.86996: no more pending results, returning what we have 28983 1726883027.87001: results queue empty 28983 1726883027.87002: checking for any_errors_fatal 28983 1726883027.87020: done checking for any_errors_fatal 28983 1726883027.87021: 
checking for max_fail_percentage 28983 1726883027.87027: done checking for max_fail_percentage 28983 1726883027.87028: checking to see if all hosts have failed and the running result is not ok 28983 1726883027.87029: done checking to see if all hosts have failed 28983 1726883027.87030: getting the remaining hosts for this loop 28983 1726883027.87032: done getting the remaining hosts for this loop 28983 1726883027.87040: getting the next task for host managed_node2 28983 1726883027.87050: done getting next task for host managed_node2 28983 1726883027.87056: ^ task is: TASK: Get NM profile info 28983 1726883027.87063: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883027.87069: getting variables 28983 1726883027.87070: in VariableManager get_vars() 28983 1726883027.87117: Calling all_inventory to load vars for managed_node2 28983 1726883027.87120: Calling groups_inventory to load vars for managed_node2 28983 1726883027.87124: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883027.87345: Calling all_plugins_play to load vars for managed_node2 28983 1726883027.87350: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883027.87355: Calling groups_plugins_play to load vars for managed_node2 28983 1726883027.89615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883027.91922: done with get_vars() 28983 1726883027.91946: done getting variables 28983 1726883027.91993: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:43:47 -0400 (0:00:00.072) 0:00:57.918 ****** 28983 1726883027.92023: entering _queue_task() for managed_node2/shell 28983 1726883027.92245: worker is 1 (out of 1 available) 28983 1726883027.92259: exiting _queue_task() for managed_node2/shell 28983 1726883027.92273: done queuing things up, now waiting for results queue to drain 28983 1726883027.92275: waiting for pending results... 
28983 1726883027.92458: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28983 1726883027.92554: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f16 28983 1726883027.92567: variable 'ansible_search_path' from source: unknown 28983 1726883027.92573: variable 'ansible_search_path' from source: unknown 28983 1726883027.92601: calling self._execute() 28983 1726883027.92683: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.92687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.92699: variable 'omit' from source: magic vars 28983 1726883027.93011: variable 'ansible_distribution_major_version' from source: facts 28983 1726883027.93021: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883027.93026: variable 'omit' from source: magic vars 28983 1726883027.93083: variable 'omit' from source: magic vars 28983 1726883027.93167: variable 'profile' from source: play vars 28983 1726883027.93176: variable 'interface' from source: play vars 28983 1726883027.93245: variable 'interface' from source: play vars 28983 1726883027.93262: variable 'omit' from source: magic vars 28983 1726883027.93305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883027.93337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883027.93354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883027.93374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883027.93384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883027.93415: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 
1726883027.93419: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.93424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.93506: Set connection var ansible_connection to ssh 28983 1726883027.93518: Set connection var ansible_shell_executable to /bin/sh 28983 1726883027.93527: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883027.93536: Set connection var ansible_timeout to 10 28983 1726883027.93543: Set connection var ansible_pipelining to False 28983 1726883027.93547: Set connection var ansible_shell_type to sh 28983 1726883027.93564: variable 'ansible_shell_executable' from source: unknown 28983 1726883027.93567: variable 'ansible_connection' from source: unknown 28983 1726883027.93574: variable 'ansible_module_compression' from source: unknown 28983 1726883027.93576: variable 'ansible_shell_type' from source: unknown 28983 1726883027.93579: variable 'ansible_shell_executable' from source: unknown 28983 1726883027.93581: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883027.93586: variable 'ansible_pipelining' from source: unknown 28983 1726883027.93589: variable 'ansible_timeout' from source: unknown 28983 1726883027.93599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883027.93712: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883027.93724: variable 'omit' from source: magic vars 28983 1726883027.93729: starting attempt loop 28983 1726883027.93739: running the handler 28983 1726883027.93747: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883027.93766: _low_level_execute_command(): starting 28983 1726883027.93775: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883027.94290: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883027.94294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883027.94298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.94349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.94354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.94436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.96508: stdout chunk (state=3): >>>/root <<< 28983 1726883027.96663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883027.96740: stderr chunk (state=3): >>><<< 28983 
1726883027.96752: stdout chunk (state=3): >>><<< 28983 1726883027.96765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883027.96798: _low_level_execute_command(): starting 28983 1726883027.96802: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205 `" && echo ansible-tmp-1726883027.9676476-31027-214901178672205="` echo /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205 `" ) && sleep 0' 28983 1726883027.97406: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.97416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883027.97476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883027.97479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883027.97559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883027.99864: stdout chunk (state=3): >>>ansible-tmp-1726883027.9676476-31027-214901178672205=/root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205 <<< 28983 1726883027.99990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883028.00033: stderr chunk (state=3): >>><<< 28983 1726883028.00037: stdout chunk (state=3): >>><<< 28983 1726883028.00054: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883027.9676476-31027-214901178672205=/root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883028.00080: variable 'ansible_module_compression' from source: unknown 28983 1726883028.00121: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883028.00150: variable 'ansible_facts' from source: unknown 28983 1726883028.00211: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py 28983 1726883028.00512: Sending initial data 28983 1726883028.00516: Sent initial data (156 bytes) 28983 1726883028.00936: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.00940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.00943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.00945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.00999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883028.01004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883028.01079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883028.02752: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28983 1726883028.02757: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883028.02818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883028.02895: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxhz8_25p /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py <<< 28983 1726883028.02899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py" <<< 28983 1726883028.02957: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxhz8_25p" to remote "/root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py" <<< 28983 1726883028.02965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py" <<< 28983 1726883028.03859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883028.03923: stderr chunk (state=3): >>><<< 28983 1726883028.03927: stdout chunk (state=3): >>><<< 28983 1726883028.03966: done transferring module to remote 28983 1726883028.03974: _low_level_execute_command(): starting 28983 1726883028.03977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/ /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py && sleep 0' 28983 1726883028.04503: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883028.04507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883028.04510: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.04512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.04517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.04581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883028.04593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883028.04674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883028.06574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883028.06611: stderr chunk (state=3): >>><<< 28983 1726883028.06619: stdout chunk (state=3): >>><<< 28983 1726883028.06637: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883028.06642: _low_level_execute_command(): starting 28983 1726883028.06644: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/AnsiballZ_command.py && sleep 0' 28983 1726883028.07031: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.07077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883028.07080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.07083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.07085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.07127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK <<< 28983 1726883028.07135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883028.07207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883028.26467: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:43:48.244756", "end": "2024-09-20 21:43:48.263532", "delta": "0:00:00.018776", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883028.28154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883028.28210: stderr chunk (state=3): >>><<< 28983 1726883028.28213: stdout chunk (state=3): >>><<< 28983 1726883028.28236: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:43:48.244756", "end": "2024-09-20 21:43:48.263532", "delta": "0:00:00.018776", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 
closed. 28983 1726883028.28276: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883028.28286: _low_level_execute_command(): starting 28983 1726883028.28292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883027.9676476-31027-214901178672205/ > /dev/null 2>&1 && sleep 0' 28983 1726883028.28866: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.28872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.28876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883028.28882: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883028.28905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883028.28962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883028.29092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883028.31030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883028.31062: stderr chunk (state=3): >>><<< 28983 1726883028.31065: stdout chunk (state=3): >>><<< 28983 1726883028.31080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 28983 1726883028.31087: handler run complete 28983 1726883028.31110: Evaluated conditional (False): False 28983 1726883028.31121: attempt loop complete, returning result 28983 1726883028.31124: _execute() done 28983 1726883028.31129: dumping result to json 28983 1726883028.31139: done dumping result, returning 28983 1726883028.31145: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affe814-3a2d-b16d-c0a7-000000000f16] 28983 1726883028.31154: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f16 28983 1726883028.31265: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f16 28983 1726883028.31269: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018776", "end": "2024-09-20 21:43:48.263532", "rc": 0, "start": "2024-09-20 21:43:48.244756" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 28983 1726883028.31359: no more pending results, returning what we have 28983 1726883028.31363: results queue empty 28983 1726883028.31364: checking for any_errors_fatal 28983 1726883028.31373: done checking for any_errors_fatal 28983 1726883028.31374: checking for max_fail_percentage 28983 1726883028.31376: done checking for max_fail_percentage 28983 1726883028.31377: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.31378: done checking to see if all hosts have failed 28983 1726883028.31381: getting the remaining hosts for this loop 28983 1726883028.31383: done getting the remaining hosts for this loop 28983 1726883028.31388: getting the next task for host managed_node2 28983 1726883028.31395: done getting next task for host managed_node2 28983 1726883028.31398: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883028.31404: ^ state is: HOST STATE: block=5, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883028.31409: getting variables 28983 1726883028.31410: in VariableManager get_vars() 28983 1726883028.31456: Calling all_inventory to load vars for managed_node2 28983 1726883028.31459: Calling groups_inventory to load vars for managed_node2 28983 1726883028.31463: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.31474: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.31477: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.31481: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.36588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.44335: done with get_vars() 28983 1726883028.44809: done getting variables 28983 1726883028.44889: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:43:48 -0400 (0:00:00.529) 0:00:58.447 ****** 28983 1726883028.44974: entering _queue_task() for managed_node2/set_fact 28983 1726883028.45744: worker is 1 (out of 1 available) 28983 1726883028.45763: exiting _queue_task() for managed_node2/set_fact 28983 1726883028.45778: done queuing things up, now waiting for results queue to drain 28983 1726883028.45780: waiting for pending results... 
28983 1726883028.46447: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883028.46586: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f17 28983 1726883028.46603: variable 'ansible_search_path' from source: unknown 28983 1726883028.46608: variable 'ansible_search_path' from source: unknown 28983 1726883028.46767: calling self._execute() 28983 1726883028.47088: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.47093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.47096: variable 'omit' from source: magic vars 28983 1726883028.47606: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.47620: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.47819: variable 'nm_profile_exists' from source: set_fact 28983 1726883028.47835: Evaluated conditional (nm_profile_exists.rc == 0): True 28983 1726883028.47842: variable 'omit' from source: magic vars 28983 1726883028.47924: variable 'omit' from source: magic vars 28983 1726883028.47978: variable 'omit' from source: magic vars 28983 1726883028.48024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883028.48088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883028.48104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883028.48127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883028.48196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883028.48200: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
28983 1726883028.48206: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.48209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.48325: Set connection var ansible_connection to ssh 28983 1726883028.48344: Set connection var ansible_shell_executable to /bin/sh 28983 1726883028.48355: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883028.48366: Set connection var ansible_timeout to 10 28983 1726883028.48376: Set connection var ansible_pipelining to False 28983 1726883028.48379: Set connection var ansible_shell_type to sh 28983 1726883028.48520: variable 'ansible_shell_executable' from source: unknown 28983 1726883028.48525: variable 'ansible_connection' from source: unknown 28983 1726883028.48528: variable 'ansible_module_compression' from source: unknown 28983 1726883028.48531: variable 'ansible_shell_type' from source: unknown 28983 1726883028.48537: variable 'ansible_shell_executable' from source: unknown 28983 1726883028.48540: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.48543: variable 'ansible_pipelining' from source: unknown 28983 1726883028.48546: variable 'ansible_timeout' from source: unknown 28983 1726883028.48549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.48636: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883028.48650: variable 'omit' from source: magic vars 28983 1726883028.48657: starting attempt loop 28983 1726883028.48666: running the handler 28983 1726883028.48689: handler run complete 28983 1726883028.48703: attempt loop complete, returning result 28983 1726883028.48707: _execute() done 
28983 1726883028.48710: dumping result to json 28983 1726883028.48715: done dumping result, returning 28983 1726883028.48739: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-b16d-c0a7-000000000f17] 28983 1726883028.48742: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f17 28983 1726883028.48905: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f17 28983 1726883028.48909: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 28983 1726883028.48983: no more pending results, returning what we have 28983 1726883028.48992: results queue empty 28983 1726883028.48993: checking for any_errors_fatal 28983 1726883028.49003: done checking for any_errors_fatal 28983 1726883028.49004: checking for max_fail_percentage 28983 1726883028.49007: done checking for max_fail_percentage 28983 1726883028.49009: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.49010: done checking to see if all hosts have failed 28983 1726883028.49011: getting the remaining hosts for this loop 28983 1726883028.49013: done getting the remaining hosts for this loop 28983 1726883028.49019: getting the next task for host managed_node2 28983 1726883028.49032: done getting next task for host managed_node2 28983 1726883028.49038: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883028.49045: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883028.49050: getting variables 28983 1726883028.49052: in VariableManager get_vars() 28983 1726883028.49095: Calling all_inventory to load vars for managed_node2 28983 1726883028.49099: Calling groups_inventory to load vars for managed_node2 28983 1726883028.49103: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.49115: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.49120: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.49124: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.51847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.55155: done with get_vars() 28983 1726883028.55195: done getting variables 28983 1726883028.55282: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883028.55463: variable 'profile' from source: play vars 28983 
1726883028.55468: variable 'interface' from source: play vars 28983 1726883028.55651: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:43:48 -0400 (0:00:00.107) 0:00:58.554 ****** 28983 1726883028.55709: entering _queue_task() for managed_node2/command 28983 1726883028.56351: worker is 1 (out of 1 available) 28983 1726883028.56364: exiting _queue_task() for managed_node2/command 28983 1726883028.56376: done queuing things up, now waiting for results queue to drain 28983 1726883028.56378: waiting for pending results... 28983 1726883028.56564: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 28983 1726883028.56624: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f19 28983 1726883028.56640: variable 'ansible_search_path' from source: unknown 28983 1726883028.56643: variable 'ansible_search_path' from source: unknown 28983 1726883028.56686: calling self._execute() 28983 1726883028.56802: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.56809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.56830: variable 'omit' from source: magic vars 28983 1726883028.57294: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.57311: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.57470: variable 'profile_stat' from source: set_fact 28983 1726883028.57496: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883028.57500: when evaluation is False, skipping this task 28983 1726883028.57503: _execute() done 28983 1726883028.57531: dumping result to json 28983 1726883028.57534: done dumping result, returning 28983 1726883028.57539: done running 
TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000f19] 28983 1726883028.57541: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f19 28983 1726883028.57704: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f19 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883028.57763: no more pending results, returning what we have 28983 1726883028.57767: results queue empty 28983 1726883028.57768: checking for any_errors_fatal 28983 1726883028.57783: done checking for any_errors_fatal 28983 1726883028.57784: checking for max_fail_percentage 28983 1726883028.57786: done checking for max_fail_percentage 28983 1726883028.57787: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.57788: done checking to see if all hosts have failed 28983 1726883028.57789: getting the remaining hosts for this loop 28983 1726883028.57791: done getting the remaining hosts for this loop 28983 1726883028.57796: getting the next task for host managed_node2 28983 1726883028.57806: done getting next task for host managed_node2 28983 1726883028.57809: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883028.57820: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883028.57826: getting variables 28983 1726883028.57828: in VariableManager get_vars() 28983 1726883028.57868: Calling all_inventory to load vars for managed_node2 28983 1726883028.57874: Calling groups_inventory to load vars for managed_node2 28983 1726883028.57879: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.57891: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.57895: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.57900: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.58420: WORKER PROCESS EXITING 28983 1726883028.60298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.63492: done with get_vars() 28983 1726883028.63528: done getting variables 28983 1726883028.63604: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883028.63732: variable 'profile' from source: play vars 28983 1726883028.63738: variable 'interface' from source: play vars 28983 1726883028.63809: variable 'interface' from source: play vars TASK [Verify 
the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:43:48 -0400 (0:00:00.081) 0:00:58.636 ****** 28983 1726883028.63855: entering _queue_task() for managed_node2/set_fact 28983 1726883028.64202: worker is 1 (out of 1 available) 28983 1726883028.64215: exiting _queue_task() for managed_node2/set_fact 28983 1726883028.64227: done queuing things up, now waiting for results queue to drain 28983 1726883028.64229: waiting for pending results... 28983 1726883028.64579: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 28983 1726883028.64722: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f1a 28983 1726883028.64784: variable 'ansible_search_path' from source: unknown 28983 1726883028.64788: variable 'ansible_search_path' from source: unknown 28983 1726883028.64806: calling self._execute() 28983 1726883028.64926: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.64944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.65002: variable 'omit' from source: magic vars 28983 1726883028.65431: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.65639: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.65643: variable 'profile_stat' from source: set_fact 28983 1726883028.65645: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883028.65647: when evaluation is False, skipping this task 28983 1726883028.65650: _execute() done 28983 1726883028.65652: dumping result to json 28983 1726883028.65655: done dumping result, returning 28983 1726883028.65665: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000f1a] 28983 
1726883028.65677: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f1a 28983 1726883028.65942: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f1a 28983 1726883028.65946: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883028.65997: no more pending results, returning what we have 28983 1726883028.66001: results queue empty 28983 1726883028.66002: checking for any_errors_fatal 28983 1726883028.66009: done checking for any_errors_fatal 28983 1726883028.66010: checking for max_fail_percentage 28983 1726883028.66012: done checking for max_fail_percentage 28983 1726883028.66013: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.66014: done checking to see if all hosts have failed 28983 1726883028.66015: getting the remaining hosts for this loop 28983 1726883028.66017: done getting the remaining hosts for this loop 28983 1726883028.66022: getting the next task for host managed_node2 28983 1726883028.66030: done getting next task for host managed_node2 28983 1726883028.66033: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28983 1726883028.66040: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883028.66045: getting variables 28983 1726883028.66047: in VariableManager get_vars() 28983 1726883028.66088: Calling all_inventory to load vars for managed_node2 28983 1726883028.66092: Calling groups_inventory to load vars for managed_node2 28983 1726883028.66096: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.66109: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.66114: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.66118: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.68555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.71694: done with get_vars() 28983 1726883028.71729: done getting variables 28983 1726883028.71809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883028.71943: variable 'profile' from source: play vars 28983 1726883028.71947: variable 'interface' from source: play vars 28983 1726883028.72022: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] 
**************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:43:48 -0400 (0:00:00.082) 0:00:58.718 ****** 28983 1726883028.72063: entering _queue_task() for managed_node2/command 28983 1726883028.72467: worker is 1 (out of 1 available) 28983 1726883028.72484: exiting _queue_task() for managed_node2/command 28983 1726883028.72497: done queuing things up, now waiting for results queue to drain 28983 1726883028.72499: waiting for pending results... 28983 1726883028.72801: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 28983 1726883028.73045: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f1b 28983 1726883028.73050: variable 'ansible_search_path' from source: unknown 28983 1726883028.73052: variable 'ansible_search_path' from source: unknown 28983 1726883028.73056: calling self._execute() 28983 1726883028.73168: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.73187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.73210: variable 'omit' from source: magic vars 28983 1726883028.73685: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.73705: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.73880: variable 'profile_stat' from source: set_fact 28983 1726883028.73953: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883028.73959: when evaluation is False, skipping this task 28983 1726883028.73962: _execute() done 28983 1726883028.73965: dumping result to json 28983 1726883028.73968: done dumping result, returning 28983 1726883028.73971: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000f1b] 28983 1726883028.73973: sending task result for task 
0affe814-3a2d-b16d-c0a7-000000000f1b skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883028.74404: no more pending results, returning what we have 28983 1726883028.74408: results queue empty 28983 1726883028.74409: checking for any_errors_fatal 28983 1726883028.74416: done checking for any_errors_fatal 28983 1726883028.74417: checking for max_fail_percentage 28983 1726883028.74419: done checking for max_fail_percentage 28983 1726883028.74420: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.74421: done checking to see if all hosts have failed 28983 1726883028.74422: getting the remaining hosts for this loop 28983 1726883028.74424: done getting the remaining hosts for this loop 28983 1726883028.74428: getting the next task for host managed_node2 28983 1726883028.74437: done getting next task for host managed_node2 28983 1726883028.74446: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28983 1726883028.74452: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883028.74457: getting variables 28983 1726883028.74458: in VariableManager get_vars() 28983 1726883028.74490: Calling all_inventory to load vars for managed_node2 28983 1726883028.74494: Calling groups_inventory to load vars for managed_node2 28983 1726883028.74498: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.74504: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f1b 28983 1726883028.74507: WORKER PROCESS EXITING 28983 1726883028.74517: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.74521: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.74526: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.76966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.80090: done with get_vars() 28983 1726883028.80127: done getting variables 28983 1726883028.80202: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883028.80339: variable 'profile' from source: play vars 28983 1726883028.80343: variable 'interface' from source: play vars 28983 1726883028.80420: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:43:48 -0400 (0:00:00.083) 
0:00:58.802 ****** 28983 1726883028.80463: entering _queue_task() for managed_node2/set_fact 28983 1726883028.80827: worker is 1 (out of 1 available) 28983 1726883028.80851: exiting _queue_task() for managed_node2/set_fact 28983 1726883028.80865: done queuing things up, now waiting for results queue to drain 28983 1726883028.80867: waiting for pending results... 28983 1726883028.81188: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 28983 1726883028.81296: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f1c 28983 1726883028.81341: variable 'ansible_search_path' from source: unknown 28983 1726883028.81346: variable 'ansible_search_path' from source: unknown 28983 1726883028.81354: calling self._execute() 28983 1726883028.81542: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.81550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.81616: variable 'omit' from source: magic vars 28983 1726883028.82091: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.82107: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.82293: variable 'profile_stat' from source: set_fact 28983 1726883028.82305: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883028.82309: when evaluation is False, skipping this task 28983 1726883028.82312: _execute() done 28983 1726883028.82317: dumping result to json 28983 1726883028.82322: done dumping result, returning 28983 1726883028.82339: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000000f1c] 28983 1726883028.82342: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f1c 28983 1726883028.82460: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f1c 28983 1726883028.82462: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": 
false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883028.82547: no more pending results, returning what we have 28983 1726883028.82551: results queue empty 28983 1726883028.82552: checking for any_errors_fatal 28983 1726883028.82558: done checking for any_errors_fatal 28983 1726883028.82558: checking for max_fail_percentage 28983 1726883028.82561: done checking for max_fail_percentage 28983 1726883028.82562: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.82563: done checking to see if all hosts have failed 28983 1726883028.82564: getting the remaining hosts for this loop 28983 1726883028.82566: done getting the remaining hosts for this loop 28983 1726883028.82571: getting the next task for host managed_node2 28983 1726883028.82580: done getting next task for host managed_node2 28983 1726883028.82583: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 28983 1726883028.82587: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883028.82593: getting variables 28983 1726883028.82595: in VariableManager get_vars() 28983 1726883028.82626: Calling all_inventory to load vars for managed_node2 28983 1726883028.82629: Calling groups_inventory to load vars for managed_node2 28983 1726883028.82633: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.82645: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.82648: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.82651: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.84921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.86983: done with get_vars() 28983 1726883028.87006: done getting variables 28983 1726883028.87057: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883028.87147: variable 'profile' from source: play vars 28983 1726883028.87150: variable 'interface' from source: play vars 28983 1726883028.87199: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:43:48 -0400 (0:00:00.067) 0:00:58.870 ****** 28983 1726883028.87224: entering _queue_task() for managed_node2/assert 28983 1726883028.87466: worker is 1 (out of 1 available) 28983 1726883028.87482: exiting _queue_task() for managed_node2/assert 28983 1726883028.87496: done queuing things up, now waiting for results queue to drain 28983 1726883028.87498: waiting for pending results... 
28983 1726883028.87956: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 28983 1726883028.87963: in run() - task 0affe814-3a2d-b16d-c0a7-000000000e8c 28983 1726883028.88086: variable 'ansible_search_path' from source: unknown 28983 1726883028.88090: variable 'ansible_search_path' from source: unknown 28983 1726883028.88093: calling self._execute() 28983 1726883028.88168: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.88193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.88306: variable 'omit' from source: magic vars 28983 1726883028.88696: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.88709: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.88720: variable 'omit' from source: magic vars 28983 1726883028.88765: variable 'omit' from source: magic vars 28983 1726883028.88850: variable 'profile' from source: play vars 28983 1726883028.88854: variable 'interface' from source: play vars 28983 1726883028.88912: variable 'interface' from source: play vars 28983 1726883028.88928: variable 'omit' from source: magic vars 28983 1726883028.88970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883028.89001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883028.89021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883028.89038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883028.89048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883028.89080: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883028.89085: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.89088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.89173: Set connection var ansible_connection to ssh 28983 1726883028.89188: Set connection var ansible_shell_executable to /bin/sh 28983 1726883028.89196: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883028.89206: Set connection var ansible_timeout to 10 28983 1726883028.89211: Set connection var ansible_pipelining to False 28983 1726883028.89215: Set connection var ansible_shell_type to sh 28983 1726883028.89239: variable 'ansible_shell_executable' from source: unknown 28983 1726883028.89242: variable 'ansible_connection' from source: unknown 28983 1726883028.89245: variable 'ansible_module_compression' from source: unknown 28983 1726883028.89248: variable 'ansible_shell_type' from source: unknown 28983 1726883028.89253: variable 'ansible_shell_executable' from source: unknown 28983 1726883028.89256: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.89262: variable 'ansible_pipelining' from source: unknown 28983 1726883028.89265: variable 'ansible_timeout' from source: unknown 28983 1726883028.89272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.89396: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883028.89409: variable 'omit' from source: magic vars 28983 1726883028.89414: starting attempt loop 28983 1726883028.89417: running the handler 28983 1726883028.89514: variable 'lsr_net_profile_exists' from source: set_fact 28983 1726883028.89518: Evaluated conditional 
(lsr_net_profile_exists): True 28983 1726883028.89524: handler run complete 28983 1726883028.89540: attempt loop complete, returning result 28983 1726883028.89545: _execute() done 28983 1726883028.89548: dumping result to json 28983 1726883028.89551: done dumping result, returning 28983 1726883028.89559: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [0affe814-3a2d-b16d-c0a7-000000000e8c] 28983 1726883028.89565: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8c 28983 1726883028.89659: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8c 28983 1726883028.89663: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883028.89722: no more pending results, returning what we have 28983 1726883028.89725: results queue empty 28983 1726883028.89726: checking for any_errors_fatal 28983 1726883028.89732: done checking for any_errors_fatal 28983 1726883028.89733: checking for max_fail_percentage 28983 1726883028.89737: done checking for max_fail_percentage 28983 1726883028.89738: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.89740: done checking to see if all hosts have failed 28983 1726883028.89741: getting the remaining hosts for this loop 28983 1726883028.89743: done getting the remaining hosts for this loop 28983 1726883028.89747: getting the next task for host managed_node2 28983 1726883028.89754: done getting next task for host managed_node2 28983 1726883028.89757: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 28983 1726883028.89761: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883028.89765: getting variables 28983 1726883028.89766: in VariableManager get_vars() 28983 1726883028.89808: Calling all_inventory to load vars for managed_node2 28983 1726883028.89811: Calling groups_inventory to load vars for managed_node2 28983 1726883028.89815: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.89824: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.89827: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.89831: Calling groups_plugins_play to load vars for managed_node2 28983 1726883028.91474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883028.94481: done with get_vars() 28983 1726883028.94517: done getting variables 28983 1726883028.94589: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883028.94718: variable 'profile' from source: play vars 28983 1726883028.94722: variable 'interface' from source: play vars 28983 1726883028.94797: variable 'interface' from 
source: play vars TASK [Assert that the ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:43:48 -0400 (0:00:00.076) 0:00:58.946 ****** 28983 1726883028.94839: entering _queue_task() for managed_node2/assert 28983 1726883028.95143: worker is 1 (out of 1 available) 28983 1726883028.95157: exiting _queue_task() for managed_node2/assert 28983 1726883028.95169: done queuing things up, now waiting for results queue to drain 28983 1726883028.95174: waiting for pending results... 28983 1726883028.95584: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 28983 1726883028.95656: in run() - task 0affe814-3a2d-b16d-c0a7-000000000e8d 28983 1726883028.95661: variable 'ansible_search_path' from source: unknown 28983 1726883028.95664: variable 'ansible_search_path' from source: unknown 28983 1726883028.95673: calling self._execute() 28983 1726883028.95788: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.95870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.95874: variable 'omit' from source: magic vars 28983 1726883028.96246: variable 'ansible_distribution_major_version' from source: facts 28983 1726883028.96259: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883028.96266: variable 'omit' from source: magic vars 28983 1726883028.96334: variable 'omit' from source: magic vars 28983 1726883028.96460: variable 'profile' from source: play vars 28983 1726883028.96467: variable 'interface' from source: play vars 28983 1726883028.96550: variable 'interface' from source: play vars 28983 1726883028.96633: variable 'omit' from source: magic vars 28983 1726883028.96643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
28983 1726883028.96668: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883028.96694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883028.96717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883028.96742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883028.96770: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883028.96841: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.96847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.96913: Set connection var ansible_connection to ssh 28983 1726883028.96927: Set connection var ansible_shell_executable to /bin/sh 28983 1726883028.96940: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883028.96951: Set connection var ansible_timeout to 10 28983 1726883028.96962: Set connection var ansible_pipelining to False 28983 1726883028.96972: Set connection var ansible_shell_type to sh 28983 1726883028.96998: variable 'ansible_shell_executable' from source: unknown 28983 1726883028.97002: variable 'ansible_connection' from source: unknown 28983 1726883028.97005: variable 'ansible_module_compression' from source: unknown 28983 1726883028.97008: variable 'ansible_shell_type' from source: unknown 28983 1726883028.97072: variable 'ansible_shell_executable' from source: unknown 28983 1726883028.97077: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883028.97080: variable 'ansible_pipelining' from source: unknown 28983 1726883028.97083: variable 'ansible_timeout' from source: unknown 28983 1726883028.97085: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883028.97207: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883028.97222: variable 'omit' from source: magic vars 28983 1726883028.97229: starting attempt loop 28983 1726883028.97232: running the handler 28983 1726883028.97398: variable 'lsr_net_profile_ansible_managed' from source: set_fact 28983 1726883028.97402: Evaluated conditional (lsr_net_profile_ansible_managed): True 28983 1726883028.97405: handler run complete 28983 1726883028.97420: attempt loop complete, returning result 28983 1726883028.97424: _execute() done 28983 1726883028.97426: dumping result to json 28983 1726883028.97429: done dumping result, returning 28983 1726883028.97540: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0affe814-3a2d-b16d-c0a7-000000000e8d] 28983 1726883028.97544: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8d 28983 1726883028.97607: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8d 28983 1726883028.97610: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883028.97665: no more pending results, returning what we have 28983 1726883028.97669: results queue empty 28983 1726883028.97670: checking for any_errors_fatal 28983 1726883028.97678: done checking for any_errors_fatal 28983 1726883028.97680: checking for max_fail_percentage 28983 1726883028.97683: done checking for max_fail_percentage 28983 1726883028.97684: checking to see if all hosts have failed and the running result is not ok 28983 1726883028.97685: done checking to see if all hosts have failed 28983 1726883028.97686: 
getting the remaining hosts for this loop 28983 1726883028.97688: done getting the remaining hosts for this loop 28983 1726883028.97693: getting the next task for host managed_node2 28983 1726883028.97700: done getting next task for host managed_node2 28983 1726883028.97703: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 28983 1726883028.97708: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883028.97713: getting variables 28983 1726883028.97715: in VariableManager get_vars() 28983 1726883028.97755: Calling all_inventory to load vars for managed_node2 28983 1726883028.97758: Calling groups_inventory to load vars for managed_node2 28983 1726883028.97763: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883028.97776: Calling all_plugins_play to load vars for managed_node2 28983 1726883028.97780: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883028.97784: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.00346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.03352: done with get_vars() 28983 1726883029.03393: done getting variables 28983 1726883029.03467: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883029.03597: variable 'profile' from source: play vars 28983 1726883029.03601: variable 'interface' from source: play vars 28983 1726883029.03683: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:43:49 -0400 (0:00:00.088) 0:00:59.034 ****** 28983 1726883029.03720: entering _queue_task() for managed_node2/assert 28983 1726883029.04269: worker is 1 (out of 1 available) 28983 1726883029.04282: exiting _queue_task() for managed_node2/assert 28983 1726883029.04295: done queuing things up, now waiting for results queue to drain 28983 1726883029.04297: waiting for pending results... 
28983 1726883029.04753: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr 28983 1726883029.04759: in run() - task 0affe814-3a2d-b16d-c0a7-000000000e8e 28983 1726883029.04764: variable 'ansible_search_path' from source: unknown 28983 1726883029.04768: variable 'ansible_search_path' from source: unknown 28983 1726883029.04772: calling self._execute() 28983 1726883029.04842: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.04847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.04850: variable 'omit' from source: magic vars 28983 1726883029.05928: variable 'ansible_distribution_major_version' from source: facts 28983 1726883029.05932: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883029.05947: variable 'omit' from source: magic vars 28983 1726883029.06004: variable 'omit' from source: magic vars 28983 1726883029.06213: variable 'profile' from source: play vars 28983 1726883029.06220: variable 'interface' from source: play vars 28983 1726883029.06328: variable 'interface' from source: play vars 28983 1726883029.06350: variable 'omit' from source: magic vars 28983 1726883029.06405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883029.06469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883029.06473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883029.06499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883029.06578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883029.06582: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 28983 1726883029.06585: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.06587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.06694: Set connection var ansible_connection to ssh 28983 1726883029.06711: Set connection var ansible_shell_executable to /bin/sh 28983 1726883029.06723: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883029.06735: Set connection var ansible_timeout to 10 28983 1726883029.06743: Set connection var ansible_pipelining to False 28983 1726883029.06746: Set connection var ansible_shell_type to sh 28983 1726883029.06777: variable 'ansible_shell_executable' from source: unknown 28983 1726883029.06780: variable 'ansible_connection' from source: unknown 28983 1726883029.06783: variable 'ansible_module_compression' from source: unknown 28983 1726883029.06788: variable 'ansible_shell_type' from source: unknown 28983 1726883029.06795: variable 'ansible_shell_executable' from source: unknown 28983 1726883029.06797: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.06800: variable 'ansible_pipelining' from source: unknown 28983 1726883029.06905: variable 'ansible_timeout' from source: unknown 28983 1726883029.06908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.06981: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883029.06995: variable 'omit' from source: magic vars 28983 1726883029.07001: starting attempt loop 28983 1726883029.07004: running the handler 28983 1726883029.07148: variable 'lsr_net_profile_fingerprint' from source: set_fact 28983 1726883029.07154: Evaluated 
conditional (lsr_net_profile_fingerprint): True 28983 1726883029.07162: handler run complete 28983 1726883029.07183: attempt loop complete, returning result 28983 1726883029.07186: _execute() done 28983 1726883029.07189: dumping result to json 28983 1726883029.07194: done dumping result, returning 28983 1726883029.07202: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [0affe814-3a2d-b16d-c0a7-000000000e8e] 28983 1726883029.07209: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8e 28983 1726883029.07312: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000e8e 28983 1726883029.07316: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883029.07377: no more pending results, returning what we have 28983 1726883029.07381: results queue empty 28983 1726883029.07382: checking for any_errors_fatal 28983 1726883029.07394: done checking for any_errors_fatal 28983 1726883029.07395: checking for max_fail_percentage 28983 1726883029.07397: done checking for max_fail_percentage 28983 1726883029.07399: checking to see if all hosts have failed and the running result is not ok 28983 1726883029.07400: done checking to see if all hosts have failed 28983 1726883029.07400: getting the remaining hosts for this loop 28983 1726883029.07403: done getting the remaining hosts for this loop 28983 1726883029.07408: getting the next task for host managed_node2 28983 1726883029.07419: done getting next task for host managed_node2 28983 1726883029.07423: ^ task is: TASK: Conditional asserts 28983 1726883029.07427: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883029.07436: getting variables 28983 1726883029.07438: in VariableManager get_vars() 28983 1726883029.07480: Calling all_inventory to load vars for managed_node2 28983 1726883029.07484: Calling groups_inventory to load vars for managed_node2 28983 1726883029.07489: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883029.07501: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.07505: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.07510: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.10006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.19121: done with get_vars() 28983 1726883029.19148: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:43:49 -0400 (0:00:00.154) 0:00:59.189 ****** 28983 1726883029.19220: entering _queue_task() for managed_node2/include_tasks 28983 1726883029.19501: worker is 1 (out of 1 available) 28983 1726883029.19515: exiting _queue_task() for managed_node2/include_tasks 28983 1726883029.19528: done queuing things up, now waiting for results queue to drain 28983 1726883029.19531: waiting for pending results... 
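[Editor's note] The "Conditional asserts" include that runs next is skipped with `"skipped_reason": "No items in the list"`. A task producing that behavior would look roughly like the sketch below; this is a guess at the shape of `run_test.yml:42`, not its actual contents, and `lsr_assert_when` plus its `item.what`/`item.condition` keys are assumed names for illustration:

```yaml
# Hypothetical sketch: an include_tasks driven by a possibly-empty list.
# When the loop list is empty, Ansible emits the skipped result with
# "No items in the list", matching the log output that follows.
- name: Conditional asserts
  ansible.builtin.include_tasks: "{{ item.what }}"
  loop: "{{ lsr_assert_when | d([]) }}"
  when: item.condition | bool
```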
28983 1726883029.19727: running TaskExecutor() for managed_node2/TASK: Conditional asserts 28983 1726883029.19817: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a4f 28983 1726883029.19832: variable 'ansible_search_path' from source: unknown 28983 1726883029.19840: variable 'ansible_search_path' from source: unknown 28983 1726883029.20157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883029.22822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883029.22876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883029.22926: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883029.22962: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883029.22986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883029.23061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883029.23085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883029.23107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883029.23144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 28983 1726883029.23157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883029.23282: dumping result to json 28983 1726883029.23286: done dumping result, returning 28983 1726883029.23293: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0affe814-3a2d-b16d-c0a7-000000000a4f] 28983 1726883029.23299: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4f 28983 1726883029.23408: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a4f 28983 1726883029.23411: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 28983 1726883029.23476: no more pending results, returning what we have 28983 1726883029.23481: results queue empty 28983 1726883029.23482: checking for any_errors_fatal 28983 1726883029.23491: done checking for any_errors_fatal 28983 1726883029.23492: checking for max_fail_percentage 28983 1726883029.23493: done checking for max_fail_percentage 28983 1726883029.23494: checking to see if all hosts have failed and the running result is not ok 28983 1726883029.23496: done checking to see if all hosts have failed 28983 1726883029.23496: getting the remaining hosts for this loop 28983 1726883029.23498: done getting the remaining hosts for this loop 28983 1726883029.23503: getting the next task for host managed_node2 28983 1726883029.23511: done getting next task for host managed_node2 28983 1726883029.23514: ^ task is: TASK: Success in test '{{ lsr_description }}' 28983 1726883029.23517: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883029.23523: getting variables 28983 1726883029.23525: in VariableManager get_vars() 28983 1726883029.23561: Calling all_inventory to load vars for managed_node2 28983 1726883029.23565: Calling groups_inventory to load vars for managed_node2 28983 1726883029.23568: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883029.23580: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.23584: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.23587: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.24852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.27133: done with get_vars() 28983 1726883029.27170: done getting variables 28983 1726883029.27240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883029.27385: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:43:49 -0400 (0:00:00.081) 0:00:59.271 ****** 28983 1726883029.27418: entering _queue_task() for managed_node2/debug 28983 1726883029.27728: worker is 1 (out of 
1 available) 28983 1726883029.27750: exiting _queue_task() for managed_node2/debug 28983 1726883029.27765: done queuing things up, now waiting for results queue to drain 28983 1726883029.27767: waiting for pending results... 28983 1726883029.27985: running TaskExecutor() for managed_node2/TASK: Success in test 'I can activate an existing profile' 28983 1726883029.28075: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a50 28983 1726883029.28090: variable 'ansible_search_path' from source: unknown 28983 1726883029.28094: variable 'ansible_search_path' from source: unknown 28983 1726883029.28133: calling self._execute() 28983 1726883029.28225: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.28235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.28246: variable 'omit' from source: magic vars 28983 1726883029.28570: variable 'ansible_distribution_major_version' from source: facts 28983 1726883029.28583: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883029.28591: variable 'omit' from source: magic vars 28983 1726883029.28627: variable 'omit' from source: magic vars 28983 1726883029.28713: variable 'lsr_description' from source: include params 28983 1726883029.28730: variable 'omit' from source: magic vars 28983 1726883029.28767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883029.28800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883029.28819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883029.28838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883029.28849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883029.28880: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883029.28884: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.28890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.28976: Set connection var ansible_connection to ssh 28983 1726883029.28988: Set connection var ansible_shell_executable to /bin/sh 28983 1726883029.28997: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883029.29006: Set connection var ansible_timeout to 10 28983 1726883029.29012: Set connection var ansible_pipelining to False 28983 1726883029.29015: Set connection var ansible_shell_type to sh 28983 1726883029.29036: variable 'ansible_shell_executable' from source: unknown 28983 1726883029.29042: variable 'ansible_connection' from source: unknown 28983 1726883029.29045: variable 'ansible_module_compression' from source: unknown 28983 1726883029.29048: variable 'ansible_shell_type' from source: unknown 28983 1726883029.29051: variable 'ansible_shell_executable' from source: unknown 28983 1726883029.29057: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.29059: variable 'ansible_pipelining' from source: unknown 28983 1726883029.29064: variable 'ansible_timeout' from source: unknown 28983 1726883029.29069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.29190: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883029.29200: variable 'omit' from source: magic vars 28983 1726883029.29207: starting attempt loop 28983 1726883029.29210: running the handler 28983 
1726883029.29254: handler run complete 28983 1726883029.29267: attempt loop complete, returning result 28983 1726883029.29272: _execute() done 28983 1726883029.29278: dumping result to json 28983 1726883029.29282: done dumping result, returning 28983 1726883029.29290: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can activate an existing profile' [0affe814-3a2d-b16d-c0a7-000000000a50] 28983 1726883029.29297: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a50 28983 1726883029.29389: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a50 28983 1726883029.29391: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 28983 1726883029.29459: no more pending results, returning what we have 28983 1726883029.29463: results queue empty 28983 1726883029.29464: checking for any_errors_fatal 28983 1726883029.29468: done checking for any_errors_fatal 28983 1726883029.29469: checking for max_fail_percentage 28983 1726883029.29471: done checking for max_fail_percentage 28983 1726883029.29472: checking to see if all hosts have failed and the running result is not ok 28983 1726883029.29473: done checking to see if all hosts have failed 28983 1726883029.29474: getting the remaining hosts for this loop 28983 1726883029.29476: done getting the remaining hosts for this loop 28983 1726883029.29480: getting the next task for host managed_node2 28983 1726883029.29487: done getting next task for host managed_node2 28983 1726883029.29490: ^ task is: TASK: Cleanup 28983 1726883029.29493: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883029.29497: getting variables 28983 1726883029.29499: in VariableManager get_vars() 28983 1726883029.29541: Calling all_inventory to load vars for managed_node2 28983 1726883029.29544: Calling groups_inventory to load vars for managed_node2 28983 1726883029.29547: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883029.29556: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.29560: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.29563: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.31267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.33025: done with get_vars() 28983 1726883029.33048: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:43:49 -0400 (0:00:00.056) 0:00:59.328 ****** 28983 1726883029.33119: entering _queue_task() for managed_node2/include_tasks 28983 1726883029.33336: worker is 1 (out of 1 available) 28983 1726883029.33350: exiting _queue_task() for managed_node2/include_tasks 28983 1726883029.33363: done queuing things up, now waiting for results queue to drain 28983 1726883029.33365: waiting for pending results... 
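[Editor's note] The "Cleanup" task queued next (task path `.../tasks/run_test.yml:66`) is an `include_tasks` that loops over the include parameter `lsr_cleanup`; the log later resolves `item` to `tasks/cleanup_profile+device.yml`. A minimal sketch consistent with that trace (hypothetical, not the verified file contents):

```yaml
# Hypothetical sketch of the Cleanup include at run_test.yml:66.
# lsr_cleanup is passed in as an include param; in this run it evidently
# contains "tasks/cleanup_profile+device.yml", the item the log resolves.
- name: Cleanup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
```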
28983 1726883029.33565: running TaskExecutor() for managed_node2/TASK: Cleanup 28983 1726883029.33652: in run() - task 0affe814-3a2d-b16d-c0a7-000000000a54 28983 1726883029.33665: variable 'ansible_search_path' from source: unknown 28983 1726883029.33670: variable 'ansible_search_path' from source: unknown 28983 1726883029.33714: variable 'lsr_cleanup' from source: include params 28983 1726883029.33897: variable 'lsr_cleanup' from source: include params 28983 1726883029.33958: variable 'omit' from source: magic vars 28983 1726883029.34077: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.34087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.34098: variable 'omit' from source: magic vars 28983 1726883029.34309: variable 'ansible_distribution_major_version' from source: facts 28983 1726883029.34319: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883029.34325: variable 'item' from source: unknown 28983 1726883029.34386: variable 'item' from source: unknown 28983 1726883029.34411: variable 'item' from source: unknown 28983 1726883029.34466: variable 'item' from source: unknown 28983 1726883029.34612: dumping result to json 28983 1726883029.34615: done dumping result, returning 28983 1726883029.34618: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affe814-3a2d-b16d-c0a7-000000000a54] 28983 1726883029.34620: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a54 28983 1726883029.34668: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000a54 28983 1726883029.34673: WORKER PROCESS EXITING 28983 1726883029.34698: no more pending results, returning what we have 28983 1726883029.34703: in VariableManager get_vars() 28983 1726883029.34738: Calling all_inventory to load vars for managed_node2 28983 1726883029.34741: Calling groups_inventory to load vars for managed_node2 28983 1726883029.34744: Calling 
all_plugins_inventory to load vars for managed_node2 28983 1726883029.34754: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.34757: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.34761: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.35985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.37703: done with get_vars() 28983 1726883029.37726: variable 'ansible_search_path' from source: unknown 28983 1726883029.37728: variable 'ansible_search_path' from source: unknown 28983 1726883029.37759: we have included files to process 28983 1726883029.37760: generating all_blocks data 28983 1726883029.37761: done generating all_blocks data 28983 1726883029.37766: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883029.37767: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883029.37768: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883029.37922: done processing included file 28983 1726883029.37924: iterating over new_blocks loaded from include file 28983 1726883029.37925: in VariableManager get_vars() 28983 1726883029.37941: done with get_vars() 28983 1726883029.37942: filtering new block on tags 28983 1726883029.37963: done filtering new block on tags 28983 1726883029.37965: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 28983 1726883029.37969: extending task lists for all hosts with included blocks 
28983 1726883029.39031: done extending task lists 28983 1726883029.39032: done processing included files 28983 1726883029.39033: results queue empty 28983 1726883029.39035: checking for any_errors_fatal 28983 1726883029.39038: done checking for any_errors_fatal 28983 1726883029.39038: checking for max_fail_percentage 28983 1726883029.39039: done checking for max_fail_percentage 28983 1726883029.39040: checking to see if all hosts have failed and the running result is not ok 28983 1726883029.39041: done checking to see if all hosts have failed 28983 1726883029.39041: getting the remaining hosts for this loop 28983 1726883029.39042: done getting the remaining hosts for this loop 28983 1726883029.39044: getting the next task for host managed_node2 28983 1726883029.39048: done getting next task for host managed_node2 28983 1726883029.39049: ^ task is: TASK: Cleanup profile and device 28983 1726883029.39051: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883029.39053: getting variables 28983 1726883029.39054: in VariableManager get_vars() 28983 1726883029.39062: Calling all_inventory to load vars for managed_node2 28983 1726883029.39063: Calling groups_inventory to load vars for managed_node2 28983 1726883029.39065: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883029.39069: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.39073: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.39075: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.40700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.42350: done with get_vars() 28983 1726883029.42373: done getting variables 28983 1726883029.42406: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:43:49 -0400 (0:00:00.093) 0:00:59.422 ****** 28983 1726883029.42429: entering _queue_task() for managed_node2/shell 28983 1726883029.42687: worker is 1 (out of 1 available) 28983 1726883029.42701: exiting _queue_task() for managed_node2/shell 28983 1726883029.42720: done queuing things up, now waiting for results queue to drain 28983 1726883029.42722: waiting for pending results... 
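[Editor's note] The "Cleanup profile and device" shell task about to run (task path `.../tasks/cleanup_profile+device.yml:3`) templates the `interface` play var, as the variable resolution below shows. Its exact commands are not visible in this log; a plausible hedged sketch of such a cleanup task:

```yaml
# Hypothetical sketch only; the real cleanup_profile+device.yml:3 commands
# are not shown in the log. The "|| true" guards make cleanup tolerant of
# the profile or device already being absent.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete {{ interface }} || true
    ip link delete {{ interface }} || true
```

The subsequent `_low_level_execute_command()` lines then show the standard execution bootstrap for any such module: probe the remote home directory with `/bin/sh -c 'echo ~ && sleep 0'`, then create a private per-task temp directory under `~/.ansible/tmp` with `umask 77`.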
28983 1726883029.43451: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 28983 1726883029.43457: in run() - task 0affe814-3a2d-b16d-c0a7-000000000f6d 28983 1726883029.43459: variable 'ansible_search_path' from source: unknown 28983 1726883029.43461: variable 'ansible_search_path' from source: unknown 28983 1726883029.43464: calling self._execute() 28983 1726883029.43467: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.43468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.43470: variable 'omit' from source: magic vars 28983 1726883029.43842: variable 'ansible_distribution_major_version' from source: facts 28983 1726883029.43861: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883029.43872: variable 'omit' from source: magic vars 28983 1726883029.43931: variable 'omit' from source: magic vars 28983 1726883029.44115: variable 'interface' from source: play vars 28983 1726883029.44147: variable 'omit' from source: magic vars 28983 1726883029.44198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883029.44248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883029.44279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883029.44305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883029.44322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883029.44363: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883029.44373: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.44382: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.44510: Set connection var ansible_connection to ssh 28983 1726883029.44528: Set connection var ansible_shell_executable to /bin/sh 28983 1726883029.44545: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883029.44560: Set connection var ansible_timeout to 10 28983 1726883029.44572: Set connection var ansible_pipelining to False 28983 1726883029.44580: Set connection var ansible_shell_type to sh 28983 1726883029.44612: variable 'ansible_shell_executable' from source: unknown 28983 1726883029.44622: variable 'ansible_connection' from source: unknown 28983 1726883029.44629: variable 'ansible_module_compression' from source: unknown 28983 1726883029.44639: variable 'ansible_shell_type' from source: unknown 28983 1726883029.44647: variable 'ansible_shell_executable' from source: unknown 28983 1726883029.44654: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.44665: variable 'ansible_pipelining' from source: unknown 28983 1726883029.44672: variable 'ansible_timeout' from source: unknown 28983 1726883029.44681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.44844: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883029.44863: variable 'omit' from source: magic vars 28983 1726883029.44873: starting attempt loop 28983 1726883029.44883: running the handler 28983 1726883029.44898: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883029.44925: _low_level_execute_command(): starting 28983 1726883029.44942: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883029.46565: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883029.46602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883029.46606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883029.46609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883029.46611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883029.46777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883029.46781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883029.46807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883029.47120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883029.48895: stdout chunk (state=3): >>>/root <<< 28983 
1726883029.48999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883029.49075: stderr chunk (state=3): >>><<< 28983 1726883029.49079: stdout chunk (state=3): >>><<< 28983 1726883029.49243: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883029.49248: _low_level_execute_command(): starting 28983 1726883029.49251: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948 `" && echo ansible-tmp-1726883029.4910562-31074-174234843461948="` echo /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948 `" ) && sleep 0' 28983 1726883029.50204: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883029.50215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883029.50229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883029.50244: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883029.50340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883029.50361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883029.50467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883029.52465: stdout chunk (state=3): >>>ansible-tmp-1726883029.4910562-31074-174234843461948=/root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948 <<< 28983 1726883029.52668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883029.52672: stdout chunk (state=3): >>><<< 28983 1726883029.52674: stderr chunk (state=3): >>><<< 28983 1726883029.52840: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883029.4910562-31074-174234843461948=/root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883029.52843: variable 'ansible_module_compression' from source: unknown 28983 1726883029.52846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883029.52848: variable 'ansible_facts' from source: unknown 28983 1726883029.52940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py 28983 1726883029.53095: Sending initial data 28983 1726883029.53196: Sent initial data (156 bytes) 28983 1726883029.53775: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883029.53857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883029.53915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883029.53944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883029.54049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883029.55696: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883029.55794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883029.55880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpwu8vt7ew /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py <<< 28983 1726883029.55884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py" <<< 28983 1726883029.55966: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpwu8vt7ew" to remote "/root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py" <<< 28983 1726883029.57803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883029.57807: stderr chunk (state=3): >>><<< 28983 1726883029.57809: stdout chunk (state=3): >>><<< 28983 1726883029.57811: done transferring module to remote 28983 1726883029.57822: _low_level_execute_command(): starting 28983 1726883029.57832: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/ /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py && sleep 0' 28983 1726883029.58566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883029.58706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883029.58748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883029.58854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883029.60849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883029.60853: stdout chunk (state=3): >>><<< 28983 1726883029.60855: stderr chunk (state=3): >>><<< 28983 1726883029.60985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883029.60989: _low_level_execute_command(): starting 28983 1726883029.60992: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/AnsiballZ_command.py && sleep 0' 28983 1726883029.61584: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883029.61603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883029.61620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883029.61655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883029.61714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883029.61718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883029.61807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883029.61869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883029.61958: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28983 1726883029.84968: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (f251b268-4387-4b61-a766-95deb90f678a) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:43:49.792756", "end": "2024-09-20 21:43:49.844381", "delta": "0:00:00.051625", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883029.87125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883029.87147: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
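The module result just above reports `"rc": 0` even though its stderr contains `Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'`. With `_uses_shell: true`, the four `nmcli`/`rm`/`ip` lines run as a single script, and the reported rc is the exit status of the *last* command only. A minimal sketch of that behavior (not the real task, just the shell semantics):

```shell
# In a multi-line shell script, only the LAST command's exit status
# becomes the overall rc -- an earlier failure is not propagated unless
# the script uses `set -e` or explicit `&&` chaining.
sh -c 'false
true'
echo "rc=$?"   # prints rc=0 despite the failed first line
```

This is why the cleanup task above is "ok" with rc=0: the failing `nmcli con load` is followed by commands that succeed.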
<<< 28983 1726883029.87248: stderr chunk (state=3): >>><<< 28983 1726883029.87270: stdout chunk (state=3): >>><<< 28983 1726883029.87440: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (f251b268-4387-4b61-a766-95deb90f678a) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:43:49.792756", "end": "2024-09-20 21:43:49.844381", "delta": "0:00:00.051625", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883029.87443: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883029.87446: _low_level_execute_command(): starting 28983 1726883029.87449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883029.4910562-31074-174234843461948/ > /dev/null 2>&1 && sleep 0' 28983 1726883029.88013: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883029.88028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883029.88047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883029.88068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883029.88092: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883029.88105: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883029.88119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883029.88142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883029.88245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883029.88259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883029.88362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883029.90353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883029.90466: stderr chunk (state=3): >>><<< 28983 1726883029.90470: stdout chunk (state=3): >>><<< 28983 1726883029.90493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883029.90512: handler run complete 28983 1726883029.90563: Evaluated conditional (False): False 28983 1726883029.90580: attempt loop complete, returning result 28983 1726883029.90583: _execute() done 28983 1726883029.90586: dumping result to json 28983 1726883029.90594: done dumping result, returning 28983 1726883029.90608: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0affe814-3a2d-b16d-c0a7-000000000f6d] 28983 1726883029.90622: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f6d ok: [managed_node2] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.051625", "end": "2024-09-20 21:43:49.844381", "rc": 0, "start": "2024-09-20 21:43:49.792756" } STDOUT: Connection 'statebr' (f251b268-4387-4b61-a766-95deb90f678a) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 28983 1726883029.90860: no more pending results, returning what we have 28983 1726883029.90864: results queue empty 28983 1726883029.90866: checking for any_errors_fatal 28983 1726883029.90868: done checking for any_errors_fatal 28983 1726883029.90869: checking for max_fail_percentage 28983 1726883029.90874: done checking for max_fail_percentage 28983 1726883029.90875: checking to see if all hosts have failed and the running result is not ok 28983 1726883029.90876: done checking to see if all hosts have failed 28983 1726883029.90877: getting the remaining hosts for this loop 28983 1726883029.90881: done getting the remaining hosts for this loop 28983 1726883029.90886: getting the next task for host managed_node2 28983 1726883029.90949: done getting next task for host managed_node2 28983 1726883029.90954: ^ task is: TASK: Include the task 'run_test.yml' 28983 1726883029.90962: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883029.91008: getting variables 28983 1726883029.91011: in VariableManager get_vars() 28983 1726883029.91102: Calling all_inventory to load vars for managed_node2 28983 1726883029.91106: Calling groups_inventory to load vars for managed_node2 28983 1726883029.91110: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883029.91125: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.91130: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.91251: Calling groups_plugins_play to load vars for managed_node2 28983 1726883029.91874: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000f6d 28983 1726883029.91878: WORKER PROCESS EXITING 28983 1726883029.93989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883029.97049: done with get_vars() 28983 1726883029.97089: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Friday 20 September 2024 21:43:49 -0400 (0:00:00.547) 0:00:59.969 ****** 28983 1726883029.97208: entering _queue_task() for managed_node2/include_tasks 28983 1726883029.97569: worker is 1 (out of 1 available) 28983 1726883029.97585: exiting _queue_task() for managed_node2/include_tasks 28983 1726883029.97598: done queuing things up, now waiting for results queue to drain 28983 1726883029.97600: waiting for pending results... 
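The remote temp-directory lifecycle shown in the exchanges above (the `( umask 77 && mkdir -p ... )` creation and the `rm -f -r ... > /dev/null 2>&1 && sleep 0` cleanup) can be reproduced locally. This is a sketch of the idiom only; the path below is a hypothetical stand-in, not the real `ansible-tmp-*` directory:

```shell
# Sketch of Ansible's remote tmpdir idiom: umask 77 yields an owner-only
# (0700) directory, and the trailing `sleep 0` guarantees a zero exit
# status for the connection plugin to observe. Path is a stand-in.
base="${TMPDIR:-/tmp}/ansible-demo-$$"
( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-demo" ) && sleep 0
stat -c '%a' "$base/ansible-tmp-demo"    # owner-only: 700 (GNU stat)
rm -f -r "$base" > /dev/null 2>&1 && sleep 0
```

The private 0700 mode matters because the AnsiballZ module payload is staged there before execution.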
28983 1726883029.97914: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 28983 1726883029.98060: in run() - task 0affe814-3a2d-b16d-c0a7-000000000013 28983 1726883029.98066: variable 'ansible_search_path' from source: unknown 28983 1726883029.98113: calling self._execute() 28983 1726883029.98278: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883029.98281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883029.98284: variable 'omit' from source: magic vars 28983 1726883029.98757: variable 'ansible_distribution_major_version' from source: facts 28983 1726883029.98782: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883029.98793: _execute() done 28983 1726883029.98802: dumping result to json 28983 1726883029.98811: done dumping result, returning 28983 1726883029.99041: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0affe814-3a2d-b16d-c0a7-000000000013] 28983 1726883029.99045: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000013 28983 1726883029.99138: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000013 28983 1726883029.99141: WORKER PROCESS EXITING 28983 1726883029.99170: no more pending results, returning what we have 28983 1726883029.99177: in VariableManager get_vars() 28983 1726883029.99220: Calling all_inventory to load vars for managed_node2 28983 1726883029.99223: Calling groups_inventory to load vars for managed_node2 28983 1726883029.99226: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883029.99240: Calling all_plugins_play to load vars for managed_node2 28983 1726883029.99244: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883029.99248: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.01484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28983 1726883030.04600: done with get_vars() 28983 1726883030.04629: variable 'ansible_search_path' from source: unknown 28983 1726883030.04648: we have included files to process 28983 1726883030.04649: generating all_blocks data 28983 1726883030.04651: done generating all_blocks data 28983 1726883030.04658: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883030.04659: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883030.04661: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883030.05192: in VariableManager get_vars() 28983 1726883030.05215: done with get_vars() 28983 1726883030.05274: in VariableManager get_vars() 28983 1726883030.05296: done with get_vars() 28983 1726883030.05349: in VariableManager get_vars() 28983 1726883030.05374: done with get_vars() 28983 1726883030.05429: in VariableManager get_vars() 28983 1726883030.05452: done with get_vars() 28983 1726883030.05510: in VariableManager get_vars() 28983 1726883030.05531: done with get_vars() 28983 1726883030.06216: in VariableManager get_vars() 28983 1726883030.06248: done with get_vars() 28983 1726883030.06268: done processing included file 28983 1726883030.06273: iterating over new_blocks loaded from include file 28983 1726883030.06275: in VariableManager get_vars() 28983 1726883030.06294: done with get_vars() 28983 1726883030.06296: filtering new block on tags 28983 1726883030.06704: done filtering new block on tags 28983 1726883030.06707: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 28983 1726883030.06714: extending task lists for all hosts with included 
blocks 28983 1726883030.06805: done extending task lists 28983 1726883030.06806: done processing included files 28983 1726883030.06807: results queue empty 28983 1726883030.06808: checking for any_errors_fatal 28983 1726883030.06816: done checking for any_errors_fatal 28983 1726883030.06817: checking for max_fail_percentage 28983 1726883030.06818: done checking for max_fail_percentage 28983 1726883030.06819: checking to see if all hosts have failed and the running result is not ok 28983 1726883030.06820: done checking to see if all hosts have failed 28983 1726883030.06821: getting the remaining hosts for this loop 28983 1726883030.06823: done getting the remaining hosts for this loop 28983 1726883030.06826: getting the next task for host managed_node2 28983 1726883030.06831: done getting next task for host managed_node2 28983 1726883030.06836: ^ task is: TASK: TEST: {{ lsr_description }} 28983 1726883030.06839: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883030.06841: getting variables 28983 1726883030.06842: in VariableManager get_vars() 28983 1726883030.06854: Calling all_inventory to load vars for managed_node2 28983 1726883030.06857: Calling groups_inventory to load vars for managed_node2 28983 1726883030.06860: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883030.06867: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.06870: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.06877: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.10377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.13493: done with get_vars() 28983 1726883030.13540: done getting variables 28983 1726883030.13628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883030.13766: variable 'lsr_description' from source: include params TASK [TEST: I can remove an existing profile without taking it down] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:43:50 -0400 (0:00:00.165) 0:01:00.135 ****** 28983 1726883030.13801: entering _queue_task() for managed_node2/debug 28983 1726883030.14214: worker is 1 (out of 1 available) 28983 1726883030.14238: exiting _queue_task() for managed_node2/debug 28983 1726883030.14253: done queuing things up, now waiting for results queue to drain 28983 1726883030.14255: waiting for pending results... 
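The task header above, `TASK [TEST: I can remove an existing profile without taking it down]`, is the templated name `TEST: {{ lsr_description }}` rendered with the `lsr_description` include parameter loaded just before it. A rough POSIX-shell stand-in for that substitution (Ansible actually uses Jinja2, not sed):

```shell
# Crude analogue of rendering "TEST: {{ lsr_description }}" with an
# include parameter; real templating is done by Jinja2 inside Ansible.
lsr_description="I can remove an existing profile without taking it down"
template='TEST: {{ lsr_description }}'
echo "$template" | sed "s/{{ lsr_description }}/$lsr_description/"
# prints: TEST: I can remove an existing profile without taking it down
```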
28983 1726883030.14559: running TaskExecutor() for managed_node2/TASK: TEST: I can remove an existing profile without taking it down 28983 1726883030.14710: in run() - task 0affe814-3a2d-b16d-c0a7-000000001005 28983 1726883030.14715: variable 'ansible_search_path' from source: unknown 28983 1726883030.14718: variable 'ansible_search_path' from source: unknown 28983 1726883030.14757: calling self._execute() 28983 1726883030.14875: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.14881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.14895: variable 'omit' from source: magic vars 28983 1726883030.15335: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.15350: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.15364: variable 'omit' from source: magic vars 28983 1726883030.15411: variable 'omit' from source: magic vars 28983 1726883030.15530: variable 'lsr_description' from source: include params 28983 1726883030.15554: variable 'omit' from source: magic vars 28983 1726883030.15846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883030.15903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.15931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883030.16225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.16229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.16231: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.16236: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883030.16238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.16662: Set connection var ansible_connection to ssh 28983 1726883030.16749: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.16900: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.16906: Set connection var ansible_timeout to 10 28983 1726883030.16909: Set connection var ansible_pipelining to False 28983 1726883030.17010: Set connection var ansible_shell_type to sh 28983 1726883030.17013: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.17018: variable 'ansible_connection' from source: unknown 28983 1726883030.17021: variable 'ansible_module_compression' from source: unknown 28983 1726883030.17024: variable 'ansible_shell_type' from source: unknown 28983 1726883030.17147: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.17151: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.17154: variable 'ansible_pipelining' from source: unknown 28983 1726883030.17294: variable 'ansible_timeout' from source: unknown 28983 1726883030.17298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.17710: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.17836: variable 'omit' from source: magic vars 28983 1726883030.17868: starting attempt loop 28983 1726883030.17896: running the handler 28983 1726883030.18022: handler run complete 28983 1726883030.18189: attempt loop complete, returning result 28983 1726883030.18195: _execute() done 28983 1726883030.18198: dumping result to json 28983 1726883030.18200: done dumping result, returning 
28983 1726883030.18203: done running TaskExecutor() for managed_node2/TASK: TEST: I can remove an existing profile without taking it down [0affe814-3a2d-b16d-c0a7-000000001005] 28983 1726883030.18274: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001005 ok: [managed_node2] => {} MSG: ########## I can remove an existing profile without taking it down ########## 28983 1726883030.18584: no more pending results, returning what we have 28983 1726883030.18588: results queue empty 28983 1726883030.18590: checking for any_errors_fatal 28983 1726883030.18593: done checking for any_errors_fatal 28983 1726883030.18594: checking for max_fail_percentage 28983 1726883030.18596: done checking for max_fail_percentage 28983 1726883030.18598: checking to see if all hosts have failed and the running result is not ok 28983 1726883030.18599: done checking to see if all hosts have failed 28983 1726883030.18600: getting the remaining hosts for this loop 28983 1726883030.18602: done getting the remaining hosts for this loop 28983 1726883030.18608: getting the next task for host managed_node2 28983 1726883030.18617: done getting next task for host managed_node2 28983 1726883030.18620: ^ task is: TASK: Show item 28983 1726883030.18625: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883030.18630: getting variables 28983 1726883030.18632: in VariableManager get_vars() 28983 1726883030.18682: Calling all_inventory to load vars for managed_node2 28983 1726883030.18686: Calling groups_inventory to load vars for managed_node2 28983 1726883030.18691: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883030.18704: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.18709: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.18714: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.19276: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001005 28983 1726883030.19279: WORKER PROCESS EXITING 28983 1726883030.20956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.24045: done with get_vars() 28983 1726883030.24090: done getting variables 28983 1726883030.24162: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:43:50 -0400 (0:00:00.103) 0:01:00.239 ****** 28983 1726883030.24202: entering _queue_task() for managed_node2/debug 28983 1726883030.24560: worker is 1 (out of 1 available) 28983 1726883030.24575: exiting _queue_task() for managed_node2/debug 28983 1726883030.24589: done queuing things up, now waiting for results queue to drain 28983 1726883030.24592: waiting for pending results... 
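For context, the `TEST` banner printed above comes from the first task in `run_test.yml` (line 5, per the task path). A minimal sketch of what that task plausibly looks like — the task name and the `#`-banner message format are inferred from the rendered output in this log, not copied from the actual file:

```yaml
# Hypothetical reconstruction of run_test.yml:5, inferred from the log:
# a debug task whose name is templated from lsr_description and whose
# message wraps the description in '#' banner lines, matching the
# "MSG: ########## ... ##########" output above.
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: |
      ##########
      {{ lsr_description }}
      ##########
```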
28983 1726883030.24916: running TaskExecutor() for managed_node2/TASK: Show item 28983 1726883030.25018: in run() - task 0affe814-3a2d-b16d-c0a7-000000001006 28983 1726883030.25033: variable 'ansible_search_path' from source: unknown 28983 1726883030.25046: variable 'ansible_search_path' from source: unknown 28983 1726883030.25093: variable 'omit' from source: magic vars 28983 1726883030.25306: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.25310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.25313: variable 'omit' from source: magic vars 28983 1726883030.25840: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.25844: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.25847: variable 'omit' from source: magic vars 28983 1726883030.25850: variable 'omit' from source: magic vars 28983 1726883030.26339: variable 'item' from source: unknown 28983 1726883030.26343: variable 'item' from source: unknown 28983 1726883030.26346: variable 'omit' from source: magic vars 28983 1726883030.26348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883030.26351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.26354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883030.26356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.26359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.26361: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.26363: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726883030.26365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.26464: Set connection var ansible_connection to ssh 28983 1726883030.26486: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.26489: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.26501: Set connection var ansible_timeout to 10 28983 1726883030.26508: Set connection var ansible_pipelining to False 28983 1726883030.26510: Set connection var ansible_shell_type to sh 28983 1726883030.26535: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.26539: variable 'ansible_connection' from source: unknown 28983 1726883030.26542: variable 'ansible_module_compression' from source: unknown 28983 1726883030.26546: variable 'ansible_shell_type' from source: unknown 28983 1726883030.26549: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.26555: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.26560: variable 'ansible_pipelining' from source: unknown 28983 1726883030.26563: variable 'ansible_timeout' from source: unknown 28983 1726883030.26569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.26769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.26836: variable 'omit' from source: magic vars 28983 1726883030.26840: starting attempt loop 28983 1726883030.26843: running the handler 28983 1726883030.26845: variable 'lsr_description' from source: include params 28983 1726883030.26940: variable 'lsr_description' from source: include params 28983 1726883030.26949: handler run complete 28983 1726883030.26962: attempt loop 
complete, returning result 28983 1726883030.26982: variable 'item' from source: unknown 28983 1726883030.27079: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can remove an existing profile without taking it down" } 28983 1726883030.27304: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.27307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.27310: variable 'omit' from source: magic vars 28983 1726883030.27944: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.27949: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.27952: variable 'omit' from source: magic vars 28983 1726883030.27954: variable 'omit' from source: magic vars 28983 1726883030.27957: variable 'item' from source: unknown 28983 1726883030.27959: variable 'item' from source: unknown 28983 1726883030.27961: variable 'omit' from source: magic vars 28983 1726883030.27964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.27966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.27969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.27971: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.27973: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.27975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.27977: Set connection var ansible_connection to ssh 28983 1726883030.27980: Set connection var 
ansible_shell_executable to /bin/sh 28983 1726883030.27982: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.27984: Set connection var ansible_timeout to 10 28983 1726883030.27986: Set connection var ansible_pipelining to False 28983 1726883030.27988: Set connection var ansible_shell_type to sh 28983 1726883030.27990: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.27992: variable 'ansible_connection' from source: unknown 28983 1726883030.27995: variable 'ansible_module_compression' from source: unknown 28983 1726883030.27996: variable 'ansible_shell_type' from source: unknown 28983 1726883030.27999: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.28001: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.28003: variable 'ansible_pipelining' from source: unknown 28983 1726883030.28005: variable 'ansible_timeout' from source: unknown 28983 1726883030.28006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.28009: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.28011: variable 'omit' from source: magic vars 28983 1726883030.28013: starting attempt loop 28983 1726883030.28015: running the handler 28983 1726883030.28017: variable 'lsr_setup' from source: include params 28983 1726883030.28133: variable 'lsr_setup' from source: include params 28983 1726883030.28139: handler run complete 28983 1726883030.28142: attempt loop complete, returning result 28983 1726883030.28144: variable 'item' from source: unknown 28983 1726883030.28242: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml" ] } 28983 1726883030.28502: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.28505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.28508: variable 'omit' from source: magic vars 28983 1726883030.28510: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.28512: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.28515: variable 'omit' from source: magic vars 28983 1726883030.28517: variable 'omit' from source: magic vars 28983 1726883030.28569: variable 'item' from source: unknown 28983 1726883030.28719: variable 'item' from source: unknown 28983 1726883030.28722: variable 'omit' from source: magic vars 28983 1726883030.28725: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.28727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.28730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.28732: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.28736: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.28738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.28789: Set connection var ansible_connection to ssh 28983 1726883030.28800: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.28813: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.28823: Set connection var ansible_timeout to 10 28983 1726883030.28829: Set connection var ansible_pipelining to 
False 28983 1726883030.28832: Set connection var ansible_shell_type to sh 28983 1726883030.28857: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.28860: variable 'ansible_connection' from source: unknown 28983 1726883030.28863: variable 'ansible_module_compression' from source: unknown 28983 1726883030.28868: variable 'ansible_shell_type' from source: unknown 28983 1726883030.28870: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.28878: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.28884: variable 'ansible_pipelining' from source: unknown 28983 1726883030.28894: variable 'ansible_timeout' from source: unknown 28983 1726883030.28896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.29041: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.29044: variable 'omit' from source: magic vars 28983 1726883030.29049: starting attempt loop 28983 1726883030.29055: running the handler 28983 1726883030.29057: variable 'lsr_test' from source: include params 28983 1726883030.29141: variable 'lsr_test' from source: include params 28983 1726883030.29144: handler run complete 28983 1726883030.29147: attempt loop complete, returning result 28983 1726883030.29154: variable 'item' from source: unknown 28983 1726883030.29272: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove_profile.yml" ] } 28983 1726883030.29443: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.29447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883030.29450: variable 'omit' from source: magic vars 28983 1726883030.29686: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.29690: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.29692: variable 'omit' from source: magic vars 28983 1726883030.29695: variable 'omit' from source: magic vars 28983 1726883030.29697: variable 'item' from source: unknown 28983 1726883030.29700: variable 'item' from source: unknown 28983 1726883030.29702: variable 'omit' from source: magic vars 28983 1726883030.29771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.29775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.29778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.29781: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.29783: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.29785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.29829: Set connection var ansible_connection to ssh 28983 1726883030.29841: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.29852: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.29862: Set connection var ansible_timeout to 10 28983 1726883030.29869: Set connection var ansible_pipelining to False 28983 1726883030.29878: Set connection var ansible_shell_type to sh 28983 1726883030.29895: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.29903: variable 'ansible_connection' from source: unknown 28983 1726883030.29905: variable 'ansible_module_compression' from 
source: unknown 28983 1726883030.29908: variable 'ansible_shell_type' from source: unknown 28983 1726883030.29910: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.29912: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.29989: variable 'ansible_pipelining' from source: unknown 28983 1726883030.29992: variable 'ansible_timeout' from source: unknown 28983 1726883030.29995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.30025: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.30035: variable 'omit' from source: magic vars 28983 1726883030.30042: starting attempt loop 28983 1726883030.30045: running the handler 28983 1726883030.30121: variable 'lsr_assert' from source: include params 28983 1726883030.30141: variable 'lsr_assert' from source: include params 28983 1726883030.30161: handler run complete 28983 1726883030.30229: attempt loop complete, returning result 28983 1726883030.30232: variable 'item' from source: unknown 28983 1726883030.30317: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_absent.yml" ] } 28983 1726883030.30388: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.30392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.30395: variable 'omit' from source: magic vars 28983 1726883030.30656: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.30662: Evaluated conditional (ansible_distribution_major_version != '6'): True 
28983 1726883030.30665: variable 'omit' from source: magic vars 28983 1726883030.30667: variable 'omit' from source: magic vars 28983 1726883030.30714: variable 'item' from source: unknown 28983 1726883030.30774: variable 'item' from source: unknown 28983 1726883030.30873: variable 'omit' from source: magic vars 28983 1726883030.30877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.30879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.30883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.30885: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.30887: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.30890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.30931: Set connection var ansible_connection to ssh 28983 1726883030.30947: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.30955: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.30964: Set connection var ansible_timeout to 10 28983 1726883030.30970: Set connection var ansible_pipelining to False 28983 1726883030.30984: Set connection var ansible_shell_type to sh 28983 1726883030.30996: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.30999: variable 'ansible_connection' from source: unknown 28983 1726883030.31003: variable 'ansible_module_compression' from source: unknown 28983 1726883030.31006: variable 'ansible_shell_type' from source: unknown 28983 1726883030.31010: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.31093: variable 'ansible_host' from source: host vars 
for 'managed_node2' 28983 1726883030.31100: variable 'ansible_pipelining' from source: unknown 28983 1726883030.31103: variable 'ansible_timeout' from source: unknown 28983 1726883030.31105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.31180: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.31183: variable 'omit' from source: magic vars 28983 1726883030.31186: starting attempt loop 28983 1726883030.31188: running the handler 28983 1726883030.31272: handler run complete 28983 1726883030.31308: attempt loop complete, returning result 28983 1726883030.31311: variable 'item' from source: unknown 28983 1726883030.31382: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 28983 1726883030.31577: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.31581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.31584: variable 'omit' from source: magic vars 28983 1726883030.31689: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.31696: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.31744: variable 'omit' from source: magic vars 28983 1726883030.31748: variable 'omit' from source: magic vars 28983 1726883030.31766: variable 'item' from source: unknown 28983 1726883030.31842: variable 'item' from source: unknown 28983 1726883030.31859: variable 'omit' from source: magic vars 28983 1726883030.31883: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.31904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.31907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.31910: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.31915: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.31962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.32005: Set connection var ansible_connection to ssh 28983 1726883030.32017: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.32027: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.32040: Set connection var ansible_timeout to 10 28983 1726883030.32047: Set connection var ansible_pipelining to False 28983 1726883030.32050: Set connection var ansible_shell_type to sh 28983 1726883030.32124: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.32127: variable 'ansible_connection' from source: unknown 28983 1726883030.32130: variable 'ansible_module_compression' from source: unknown 28983 1726883030.32132: variable 'ansible_shell_type' from source: unknown 28983 1726883030.32137: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.32140: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.32142: variable 'ansible_pipelining' from source: unknown 28983 1726883030.32145: variable 'ansible_timeout' from source: unknown 28983 1726883030.32147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.32214: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.32223: variable 'omit' from source: magic vars 28983 1726883030.32231: starting attempt loop 28983 1726883030.32234: running the handler 28983 1726883030.32307: variable 'lsr_fail_debug' from source: play vars 28983 1726883030.32329: variable 'lsr_fail_debug' from source: play vars 28983 1726883030.32350: handler run complete 28983 1726883030.32369: attempt loop complete, returning result 28983 1726883030.32401: variable 'item' from source: unknown 28983 1726883030.32469: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 28983 1726883030.32593: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.32596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.32599: variable 'omit' from source: magic vars 28983 1726883030.32713: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.32724: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.32727: variable 'omit' from source: magic vars 28983 1726883030.32739: variable 'omit' from source: magic vars 28983 1726883030.32771: variable 'item' from source: unknown 28983 1726883030.32828: variable 'item' from source: unknown 28983 1726883030.32844: variable 'omit' from source: magic vars 28983 1726883030.32859: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.32866: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.32873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.32886: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.32889: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.32894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.32951: Set connection var ansible_connection to ssh 28983 1726883030.32960: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.32969: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.32980: Set connection var ansible_timeout to 10 28983 1726883030.32986: Set connection var ansible_pipelining to False 28983 1726883030.32988: Set connection var ansible_shell_type to sh 28983 1726883030.33004: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.33007: variable 'ansible_connection' from source: unknown 28983 1726883030.33010: variable 'ansible_module_compression' from source: unknown 28983 1726883030.33014: variable 'ansible_shell_type' from source: unknown 28983 1726883030.33017: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.33023: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.33025: variable 'ansible_pipelining' from source: unknown 28983 1726883030.33030: variable 'ansible_timeout' from source: unknown 28983 1726883030.33037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.33108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.33115: variable 'omit' from source: magic vars 28983 1726883030.33120: starting attempt loop 28983 1726883030.33123: running the handler 28983 1726883030.33141: variable 'lsr_cleanup' from source: include params 28983 1726883030.33197: variable 'lsr_cleanup' from source: include params 28983 1726883030.33210: handler run complete 28983 1726883030.33222: attempt loop complete, returning result 28983 1726883030.33237: variable 'item' from source: unknown 28983 1726883030.33293: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 28983 1726883030.33375: dumping result to json 28983 1726883030.33378: done dumping result, returning 28983 1726883030.33382: done running TaskExecutor() for managed_node2/TASK: Show item [0affe814-3a2d-b16d-c0a7-000000001006] 28983 1726883030.33385: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001006 28983 1726883030.33431: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001006 28983 1726883030.33436: WORKER PROCESS EXITING 28983 1726883030.33500: no more pending results, returning what we have 28983 1726883030.33503: results queue empty 28983 1726883030.33504: checking for any_errors_fatal 28983 1726883030.33510: done checking for any_errors_fatal 28983 1726883030.33511: checking for max_fail_percentage 28983 1726883030.33513: done checking for max_fail_percentage 28983 1726883030.33514: checking to see if all hosts have failed and the running result is not ok 28983 1726883030.33515: done checking to see if all hosts have failed 28983 1726883030.33516: getting the remaining hosts for this loop 28983 1726883030.33518: done getting the remaining hosts for this loop 28983 
1726883030.33522: getting the next task for host managed_node2 28983 1726883030.33529: done getting next task for host managed_node2 28983 1726883030.33532: ^ task is: TASK: Include the task 'show_interfaces.yml' 28983 1726883030.33538: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883030.33543: getting variables 28983 1726883030.33544: in VariableManager get_vars() 28983 1726883030.33582: Calling all_inventory to load vars for managed_node2 28983 1726883030.33585: Calling groups_inventory to load vars for managed_node2 28983 1726883030.33589: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883030.33600: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.33603: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.33606: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.35344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.37080: done with get_vars() 28983 1726883030.37103: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:43:50 -0400 (0:00:00.129) 0:01:00.369 ****** 28983 1726883030.37181: entering _queue_task() for managed_node2/include_tasks 28983 
1726883030.37422: worker is 1 (out of 1 available) 28983 1726883030.37437: exiting _queue_task() for managed_node2/include_tasks 28983 1726883030.37451: done queuing things up, now waiting for results queue to drain 28983 1726883030.37453: waiting for pending results... 28983 1726883030.37647: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28983 1726883030.37736: in run() - task 0affe814-3a2d-b16d-c0a7-000000001007 28983 1726883030.37748: variable 'ansible_search_path' from source: unknown 28983 1726883030.37752: variable 'ansible_search_path' from source: unknown 28983 1726883030.37787: calling self._execute() 28983 1726883030.37873: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.37877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.37889: variable 'omit' from source: magic vars 28983 1726883030.38477: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.38484: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.38489: _execute() done 28983 1726883030.38493: dumping result to json 28983 1726883030.38496: done dumping result, returning 28983 1726883030.38498: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-b16d-c0a7-000000001007] 28983 1726883030.38501: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001007 28983 1726883030.38607: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001007 28983 1726883030.38614: WORKER PROCESS EXITING 28983 1726883030.38659: no more pending results, returning what we have 28983 1726883030.38670: in VariableManager get_vars() 28983 1726883030.38730: Calling all_inventory to load vars for managed_node2 28983 1726883030.38737: Calling groups_inventory to load vars for managed_node2 28983 1726883030.38741: Calling all_plugins_inventory to load vars for managed_node2 
28983 1726883030.38759: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.38763: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.38767: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.40810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.43238: done with get_vars() 28983 1726883030.43261: variable 'ansible_search_path' from source: unknown 28983 1726883030.43262: variable 'ansible_search_path' from source: unknown 28983 1726883030.43295: we have included files to process 28983 1726883030.43296: generating all_blocks data 28983 1726883030.43298: done generating all_blocks data 28983 1726883030.43302: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883030.43303: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883030.43304: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883030.43390: in VariableManager get_vars() 28983 1726883030.43407: done with get_vars() 28983 1726883030.43503: done processing included file 28983 1726883030.43504: iterating over new_blocks loaded from include file 28983 1726883030.43505: in VariableManager get_vars() 28983 1726883030.43517: done with get_vars() 28983 1726883030.43518: filtering new block on tags 28983 1726883030.43549: done filtering new block on tags 28983 1726883030.43551: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28983 1726883030.43555: extending task lists for all hosts with included blocks 28983 1726883030.43965: 
done extending task lists 28983 1726883030.43966: done processing included files 28983 1726883030.43967: results queue empty 28983 1726883030.43968: checking for any_errors_fatal 28983 1726883030.43977: done checking for any_errors_fatal 28983 1726883030.43977: checking for max_fail_percentage 28983 1726883030.43978: done checking for max_fail_percentage 28983 1726883030.43979: checking to see if all hosts have failed and the running result is not ok 28983 1726883030.43979: done checking to see if all hosts have failed 28983 1726883030.43980: getting the remaining hosts for this loop 28983 1726883030.43981: done getting the remaining hosts for this loop 28983 1726883030.43983: getting the next task for host managed_node2 28983 1726883030.43987: done getting next task for host managed_node2 28983 1726883030.43988: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28983 1726883030.43991: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883030.43993: getting variables 28983 1726883030.43994: in VariableManager get_vars() 28983 1726883030.44002: Calling all_inventory to load vars for managed_node2 28983 1726883030.44004: Calling groups_inventory to load vars for managed_node2 28983 1726883030.44006: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883030.44011: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.44013: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.44015: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.45488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.49591: done with get_vars() 28983 1726883030.49723: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:43:50 -0400 (0:00:00.126) 0:01:00.496 ****** 28983 1726883030.49828: entering _queue_task() for managed_node2/include_tasks 28983 1726883030.50294: worker is 1 (out of 1 available) 28983 1726883030.50307: exiting _queue_task() for managed_node2/include_tasks 28983 1726883030.50323: done queuing things up, now waiting for results queue to drain 28983 1726883030.50325: waiting for pending results... 
28983 1726883030.50787: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28983 1726883030.50796: in run() - task 0affe814-3a2d-b16d-c0a7-00000000102e 28983 1726883030.50878: variable 'ansible_search_path' from source: unknown 28983 1726883030.50884: variable 'ansible_search_path' from source: unknown 28983 1726883030.50887: calling self._execute() 28983 1726883030.51120: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.51125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.51128: variable 'omit' from source: magic vars 28983 1726883030.51641: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.51645: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.51648: _execute() done 28983 1726883030.51651: dumping result to json 28983 1726883030.51654: done dumping result, returning 28983 1726883030.51656: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-b16d-c0a7-00000000102e] 28983 1726883030.51658: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000102e 28983 1726883030.51966: no more pending results, returning what we have 28983 1726883030.51971: in VariableManager get_vars() 28983 1726883030.52010: Calling all_inventory to load vars for managed_node2 28983 1726883030.52013: Calling groups_inventory to load vars for managed_node2 28983 1726883030.52017: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883030.52027: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.52031: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.52036: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.52742: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000102e 28983 1726883030.52747: WORKER PROCESS EXITING 28983 
1726883030.57422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.63130: done with get_vars() 28983 1726883030.63173: variable 'ansible_search_path' from source: unknown 28983 1726883030.63175: variable 'ansible_search_path' from source: unknown 28983 1726883030.63225: we have included files to process 28983 1726883030.63226: generating all_blocks data 28983 1726883030.63230: done generating all_blocks data 28983 1726883030.63232: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883030.63233: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883030.63325: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883030.63990: done processing included file 28983 1726883030.64063: iterating over new_blocks loaded from include file 28983 1726883030.64067: in VariableManager get_vars() 28983 1726883030.64090: done with get_vars() 28983 1726883030.64094: filtering new block on tags 28983 1726883030.64229: done filtering new block on tags 28983 1726883030.64262: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28983 1726883030.64269: extending task lists for all hosts with included blocks 28983 1726883030.64527: done extending task lists 28983 1726883030.64528: done processing included files 28983 1726883030.64530: results queue empty 28983 1726883030.64530: checking for any_errors_fatal 28983 1726883030.64536: done checking for any_errors_fatal 28983 1726883030.64537: checking for max_fail_percentage 28983 1726883030.64539: done 
checking for max_fail_percentage 28983 1726883030.64540: checking to see if all hosts have failed and the running result is not ok 28983 1726883030.64541: done checking to see if all hosts have failed 28983 1726883030.64542: getting the remaining hosts for this loop 28983 1726883030.64543: done getting the remaining hosts for this loop 28983 1726883030.64547: getting the next task for host managed_node2 28983 1726883030.64552: done getting next task for host managed_node2 28983 1726883030.64555: ^ task is: TASK: Gather current interface info 28983 1726883030.64559: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883030.64566: getting variables 28983 1726883030.64567: in VariableManager get_vars() 28983 1726883030.64580: Calling all_inventory to load vars for managed_node2 28983 1726883030.64583: Calling groups_inventory to load vars for managed_node2 28983 1726883030.64586: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883030.64592: Calling all_plugins_play to load vars for managed_node2 28983 1726883030.64595: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883030.64599: Calling groups_plugins_play to load vars for managed_node2 28983 1726883030.67690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883030.72425: done with get_vars() 28983 1726883030.72471: done getting variables 28983 1726883030.72549: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:43:50 -0400 (0:00:00.227) 0:01:00.723 ****** 28983 1726883030.72600: entering _queue_task() for managed_node2/command 28983 1726883030.73071: worker is 1 (out of 1 available) 28983 1726883030.73087: exiting _queue_task() for managed_node2/command 28983 1726883030.73106: done queuing things up, now waiting for results queue to drain 28983 1726883030.73110: waiting for pending results... 
28983 1726883030.74177: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28983 1726883030.74458: in run() - task 0affe814-3a2d-b16d-c0a7-000000001069 28983 1726883030.74491: variable 'ansible_search_path' from source: unknown 28983 1726883030.74526: variable 'ansible_search_path' from source: unknown 28983 1726883030.74664: calling self._execute() 28983 1726883030.74774: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.74796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.74823: variable 'omit' from source: magic vars 28983 1726883030.75332: variable 'ansible_distribution_major_version' from source: facts 28983 1726883030.75359: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883030.75372: variable 'omit' from source: magic vars 28983 1726883030.75436: variable 'omit' from source: magic vars 28983 1726883030.75476: variable 'omit' from source: magic vars 28983 1726883030.75537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883030.75564: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883030.75597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883030.75651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.75656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883030.75671: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883030.75677: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.75680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883030.75782: Set connection var ansible_connection to ssh 28983 1726883030.75790: Set connection var ansible_shell_executable to /bin/sh 28983 1726883030.75842: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883030.75847: Set connection var ansible_timeout to 10 28983 1726883030.75850: Set connection var ansible_pipelining to False 28983 1726883030.75857: Set connection var ansible_shell_type to sh 28983 1726883030.75897: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.75901: variable 'ansible_connection' from source: unknown 28983 1726883030.75905: variable 'ansible_module_compression' from source: unknown 28983 1726883030.75907: variable 'ansible_shell_type' from source: unknown 28983 1726883030.75913: variable 'ansible_shell_executable' from source: unknown 28983 1726883030.75915: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883030.75918: variable 'ansible_pipelining' from source: unknown 28983 1726883030.75920: variable 'ansible_timeout' from source: unknown 28983 1726883030.75922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883030.76144: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883030.76148: variable 'omit' from source: magic vars 28983 1726883030.76151: starting attempt loop 28983 1726883030.76153: running the handler 28983 1726883030.76164: _low_level_execute_command(): starting 28983 1726883030.76172: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883030.77133: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 28983 1726883030.77170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883030.77177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883030.77181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883030.77241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883030.77297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883030.77364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883030.79147: stdout chunk (state=3): >>>/root <<< 28983 1726883030.79255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883030.79303: stderr chunk (state=3): >>><<< 28983 1726883030.79310: stdout chunk (state=3): >>><<< 28983 1726883030.79333: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883030.79355: _low_level_execute_command(): starting 28983 1726883030.79360: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663 `" && echo ansible-tmp-1726883030.793411-31140-117302534539663="` echo /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663 `" ) && sleep 0' 28983 1726883030.79976: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883030.80012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883030.80115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883030.82157: stdout chunk (state=3): >>>ansible-tmp-1726883030.793411-31140-117302534539663=/root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663 <<< 28983 1726883030.82301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883030.82316: stdout chunk (state=3): >>><<< 28983 1726883030.82319: stderr chunk (state=3): >>><<< 28983 1726883030.82341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883030.793411-31140-117302534539663=/root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883030.82362: variable 'ansible_module_compression' from source: unknown 28983 1726883030.82403: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883030.82432: variable 'ansible_facts' from source: unknown 28983 1726883030.82497: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py 28983 1726883030.82604: Sending initial data 28983 1726883030.82607: Sent initial data (155 bytes) 28983 1726883030.83008: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883030.83036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883030.83040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883030.83043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883030.83097: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883030.83102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883030.83189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883030.84799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883030.84803: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883030.84861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883030.84923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp_scbqrrq /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py <<< 28983 1726883030.84933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py" <<< 28983 1726883030.84995: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp_scbqrrq" to remote "/root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py" <<< 28983 1726883030.85890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883030.85940: stderr chunk (state=3): >>><<< 28983 1726883030.85944: stdout chunk (state=3): >>><<< 28983 1726883030.85961: done transferring module to remote 28983 1726883030.85976: _low_level_execute_command(): starting 28983 1726883030.85981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/ /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py && sleep 0' 28983 1726883030.86413: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883030.86418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883030.86421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726883030.86423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883030.86426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883030.86474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883030.86477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883030.86554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883030.88487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883030.88508: stderr chunk (state=3): >>><<< 28983 1726883030.88511: stdout chunk (state=3): >>><<< 28983 1726883030.88525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883030.88529: _low_level_execute_command(): starting 28983 1726883030.88536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/AnsiballZ_command.py && sleep 0' 28983 1726883030.88960: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883030.88964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883030.88967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883030.88969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883030.88971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883030.89028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883030.89032: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883030.89106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883031.06919: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:43:51.064418", "end": "2024-09-20 21:43:51.068116", "delta": "0:00:00.003698", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883031.08740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883031.08743: stderr chunk (state=3): >>><<< 28983 1726883031.08746: stdout chunk (state=3): >>><<< 28983 1726883031.08749: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:43:51.064418", "end": "2024-09-20 21:43:51.068116", "delta": "0:00:00.003698", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883031.08804: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883031.08813: _low_level_execute_command(): starting 28983 1726883031.08819: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883030.793411-31140-117302534539663/ > /dev/null 2>&1 && sleep 0' 28983 1726883031.09443: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883031.09453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
28983 1726883031.09465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883031.09491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883031.09505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883031.09513: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883031.09525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883031.09543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883031.09552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883031.09559: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883031.09739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883031.09742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883031.09750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883031.09752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883031.09759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883031.09761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883031.09763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883031.09827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883031.11889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 
1726883031.11893: stdout chunk (state=3): >>><<< 28983 1726883031.11896: stderr chunk (state=3): >>><<< 28983 1726883031.12040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883031.12044: handler run complete 28983 1726883031.12046: Evaluated conditional (False): False 28983 1726883031.12049: attempt loop complete, returning result 28983 1726883031.12051: _execute() done 28983 1726883031.12053: dumping result to json 28983 1726883031.12056: done dumping result, returning 28983 1726883031.12058: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affe814-3a2d-b16d-c0a7-000000001069] 28983 1726883031.12060: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001069 28983 1726883031.12355: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001069 28983 1726883031.12359: WORKER 
PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003698", "end": "2024-09-20 21:43:51.068116", "rc": 0, "start": "2024-09-20 21:43:51.064418" } STDOUT: bonding_masters eth0 lo 28983 1726883031.12494: no more pending results, returning what we have 28983 1726883031.12498: results queue empty 28983 1726883031.12499: checking for any_errors_fatal 28983 1726883031.12502: done checking for any_errors_fatal 28983 1726883031.12503: checking for max_fail_percentage 28983 1726883031.12505: done checking for max_fail_percentage 28983 1726883031.12506: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.12507: done checking to see if all hosts have failed 28983 1726883031.12508: getting the remaining hosts for this loop 28983 1726883031.12511: done getting the remaining hosts for this loop 28983 1726883031.12517: getting the next task for host managed_node2 28983 1726883031.12527: done getting next task for host managed_node2 28983 1726883031.12531: ^ task is: TASK: Set current_interfaces 28983 1726883031.12604: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883031.12613: getting variables 28983 1726883031.12615: in VariableManager get_vars() 28983 1726883031.12737: Calling all_inventory to load vars for managed_node2 28983 1726883031.12741: Calling groups_inventory to load vars for managed_node2 28983 1726883031.12746: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.12825: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.12830: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.12902: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.15728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.18733: done with get_vars() 28983 1726883031.18771: done getting variables 28983 1726883031.18839: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:43:51 -0400 (0:00:00.462) 0:01:01.186 ****** 28983 1726883031.18877: entering _queue_task() for managed_node2/set_fact 28983 1726883031.19190: worker is 1 (out of 1 available) 28983 1726883031.19203: exiting _queue_task() for managed_node2/set_fact 28983 1726883031.19218: done queuing things up, now waiting for results queue to drain 28983 
1726883031.19219: waiting for pending results... 28983 1726883031.19565: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28983 1726883031.19742: in run() - task 0affe814-3a2d-b16d-c0a7-00000000106a 28983 1726883031.19747: variable 'ansible_search_path' from source: unknown 28983 1726883031.19751: variable 'ansible_search_path' from source: unknown 28983 1726883031.19753: calling self._execute() 28983 1726883031.19821: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.19829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.19844: variable 'omit' from source: magic vars 28983 1726883031.20278: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.20290: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.20297: variable 'omit' from source: magic vars 28983 1726883031.20506: variable 'omit' from source: magic vars 28983 1726883031.20509: variable '_current_interfaces' from source: set_fact 28983 1726883031.20578: variable 'omit' from source: magic vars 28983 1726883031.20627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883031.20672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883031.20696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883031.20718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883031.20759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883031.20775: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883031.20778: variable 'ansible_host' from source: host vars for 
'managed_node2' 28983 1726883031.20781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.20910: Set connection var ansible_connection to ssh 28983 1726883031.20925: Set connection var ansible_shell_executable to /bin/sh 28983 1726883031.20938: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883031.20949: Set connection var ansible_timeout to 10 28983 1726883031.20961: Set connection var ansible_pipelining to False 28983 1726883031.20964: Set connection var ansible_shell_type to sh 28983 1726883031.21007: variable 'ansible_shell_executable' from source: unknown 28983 1726883031.21011: variable 'ansible_connection' from source: unknown 28983 1726883031.21014: variable 'ansible_module_compression' from source: unknown 28983 1726883031.21016: variable 'ansible_shell_type' from source: unknown 28983 1726883031.21019: variable 'ansible_shell_executable' from source: unknown 28983 1726883031.21021: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.21023: variable 'ansible_pipelining' from source: unknown 28983 1726883031.21025: variable 'ansible_timeout' from source: unknown 28983 1726883031.21027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.21226: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883031.21230: variable 'omit' from source: magic vars 28983 1726883031.21233: starting attempt loop 28983 1726883031.21292: running the handler 28983 1726883031.21296: handler run complete 28983 1726883031.21298: attempt loop complete, returning result 28983 1726883031.21301: _execute() done 28983 1726883031.21303: dumping result to json 28983 1726883031.21305: done 
dumping result, returning 28983 1726883031.21308: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affe814-3a2d-b16d-c0a7-00000000106a] 28983 1726883031.21311: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000106a 28983 1726883031.21402: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000106a 28983 1726883031.21407: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28983 1726883031.21478: no more pending results, returning what we have 28983 1726883031.21482: results queue empty 28983 1726883031.21483: checking for any_errors_fatal 28983 1726883031.21495: done checking for any_errors_fatal 28983 1726883031.21496: checking for max_fail_percentage 28983 1726883031.21498: done checking for max_fail_percentage 28983 1726883031.21499: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.21500: done checking to see if all hosts have failed 28983 1726883031.21501: getting the remaining hosts for this loop 28983 1726883031.21504: done getting the remaining hosts for this loop 28983 1726883031.21510: getting the next task for host managed_node2 28983 1726883031.21521: done getting next task for host managed_node2 28983 1726883031.21524: ^ task is: TASK: Show current_interfaces 28983 1726883031.21530: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883031.21538: getting variables 28983 1726883031.21539: in VariableManager get_vars() 28983 1726883031.21583: Calling all_inventory to load vars for managed_node2 28983 1726883031.21587: Calling groups_inventory to load vars for managed_node2 28983 1726883031.21592: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.21604: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.21609: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.21613: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.24098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.27053: done with get_vars() 28983 1726883031.27089: done getting variables 28983 1726883031.27158: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:43:51 -0400 (0:00:00.083) 0:01:01.269 ****** 28983 1726883031.27195: entering _queue_task() for managed_node2/debug 28983 1726883031.27515: worker is 1 (out of 1 available) 28983 1726883031.27529: exiting _queue_task() for managed_node2/debug 28983 1726883031.27543: done queuing things up, now waiting for results queue to drain 28983 
1726883031.27545: waiting for pending results... 28983 1726883031.27955: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28983 1726883031.28117: in run() - task 0affe814-3a2d-b16d-c0a7-00000000102f 28983 1726883031.28121: variable 'ansible_search_path' from source: unknown 28983 1726883031.28124: variable 'ansible_search_path' from source: unknown 28983 1726883031.28127: calling self._execute() 28983 1726883031.28153: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.28161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.28178: variable 'omit' from source: magic vars 28983 1726883031.28623: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.28640: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.28647: variable 'omit' from source: magic vars 28983 1726883031.28709: variable 'omit' from source: magic vars 28983 1726883031.28828: variable 'current_interfaces' from source: set_fact 28983 1726883031.28866: variable 'omit' from source: magic vars 28983 1726883031.28913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883031.28959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883031.28988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883031.29007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883031.29098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883031.29101: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883031.29104: variable 'ansible_host' from source: host vars for 
'managed_node2' 28983 1726883031.29106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.29194: Set connection var ansible_connection to ssh 28983 1726883031.29208: Set connection var ansible_shell_executable to /bin/sh 28983 1726883031.29220: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883031.29231: Set connection var ansible_timeout to 10 28983 1726883031.29240: Set connection var ansible_pipelining to False 28983 1726883031.29243: Set connection var ansible_shell_type to sh 28983 1726883031.29272: variable 'ansible_shell_executable' from source: unknown 28983 1726883031.29279: variable 'ansible_connection' from source: unknown 28983 1726883031.29288: variable 'ansible_module_compression' from source: unknown 28983 1726883031.29293: variable 'ansible_shell_type' from source: unknown 28983 1726883031.29295: variable 'ansible_shell_executable' from source: unknown 28983 1726883031.29313: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.29316: variable 'ansible_pipelining' from source: unknown 28983 1726883031.29318: variable 'ansible_timeout' from source: unknown 28983 1726883031.29320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.29533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883031.29539: variable 'omit' from source: magic vars 28983 1726883031.29541: starting attempt loop 28983 1726883031.29544: running the handler 28983 1726883031.29564: handler run complete 28983 1726883031.29581: attempt loop complete, returning result 28983 1726883031.29585: _execute() done 28983 1726883031.29587: dumping result to json 28983 1726883031.29593: done 
dumping result, returning 28983 1726883031.29603: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affe814-3a2d-b16d-c0a7-00000000102f] 28983 1726883031.29609: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000102f 28983 1726883031.29819: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000102f 28983 1726883031.29822: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28983 1726883031.29872: no more pending results, returning what we have 28983 1726883031.29876: results queue empty 28983 1726883031.29877: checking for any_errors_fatal 28983 1726883031.29883: done checking for any_errors_fatal 28983 1726883031.29883: checking for max_fail_percentage 28983 1726883031.29885: done checking for max_fail_percentage 28983 1726883031.29886: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.29887: done checking to see if all hosts have failed 28983 1726883031.29888: getting the remaining hosts for this loop 28983 1726883031.29890: done getting the remaining hosts for this loop 28983 1726883031.29894: getting the next task for host managed_node2 28983 1726883031.29901: done getting next task for host managed_node2 28983 1726883031.29904: ^ task is: TASK: Setup 28983 1726883031.29907: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883031.29910: getting variables 28983 1726883031.29912: in VariableManager get_vars() 28983 1726883031.29947: Calling all_inventory to load vars for managed_node2 28983 1726883031.29950: Calling groups_inventory to load vars for managed_node2 28983 1726883031.29954: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.29964: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.29968: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.29972: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.32208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.40077: done with get_vars() 28983 1726883031.40118: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:43:51 -0400 (0:00:00.130) 0:01:01.400 ****** 28983 1726883031.40223: entering _queue_task() for managed_node2/include_tasks 28983 1726883031.40599: worker is 1 (out of 1 available) 28983 1726883031.40613: exiting _queue_task() for managed_node2/include_tasks 28983 1726883031.40627: done queuing things up, now waiting for results queue to drain 28983 1726883031.40629: waiting for pending results... 
28983 1726883031.40943: running TaskExecutor() for managed_node2/TASK: Setup 28983 1726883031.41078: in run() - task 0affe814-3a2d-b16d-c0a7-000000001008 28983 1726883031.41084: variable 'ansible_search_path' from source: unknown 28983 1726883031.41088: variable 'ansible_search_path' from source: unknown 28983 1726883031.41187: variable 'lsr_setup' from source: include params 28983 1726883031.41650: variable 'lsr_setup' from source: include params 28983 1726883031.41653: variable 'omit' from source: magic vars 28983 1726883031.41656: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.41660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.41663: variable 'omit' from source: magic vars 28983 1726883031.41958: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.41969: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.41979: variable 'item' from source: unknown 28983 1726883031.42055: variable 'item' from source: unknown 28983 1726883031.42092: variable 'item' from source: unknown 28983 1726883031.42172: variable 'item' from source: unknown 28983 1726883031.42325: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.42329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.42332: variable 'omit' from source: magic vars 28983 1726883031.42610: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.42613: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.42616: variable 'item' from source: unknown 28983 1726883031.42618: variable 'item' from source: unknown 28983 1726883031.42633: variable 'item' from source: unknown 28983 1726883031.42715: variable 'item' from source: unknown 28983 1726883031.42791: dumping result to json 28983 1726883031.42795: done dumping result, returning 28983 
1726883031.42798: done running TaskExecutor() for managed_node2/TASK: Setup [0affe814-3a2d-b16d-c0a7-000000001008] 28983 1726883031.42802: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001008 28983 1726883031.42966: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001008 28983 1726883031.42970: WORKER PROCESS EXITING 28983 1726883031.43006: no more pending results, returning what we have 28983 1726883031.43011: in VariableManager get_vars() 28983 1726883031.43053: Calling all_inventory to load vars for managed_node2 28983 1726883031.43057: Calling groups_inventory to load vars for managed_node2 28983 1726883031.43061: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.43073: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.43077: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.43081: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.45339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.48308: done with get_vars() 28983 1726883031.48341: variable 'ansible_search_path' from source: unknown 28983 1726883031.48343: variable 'ansible_search_path' from source: unknown 28983 1726883031.48387: variable 'ansible_search_path' from source: unknown 28983 1726883031.48389: variable 'ansible_search_path' from source: unknown 28983 1726883031.48420: we have included files to process 28983 1726883031.48421: generating all_blocks data 28983 1726883031.48423: done generating all_blocks data 28983 1726883031.48428: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883031.48430: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883031.48433: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883031.48742: done processing included file 28983 1726883031.48745: iterating over new_blocks loaded from include file 28983 1726883031.48747: in VariableManager get_vars() 28983 1726883031.48767: done with get_vars() 28983 1726883031.48769: filtering new block on tags 28983 1726883031.48815: done filtering new block on tags 28983 1726883031.48818: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 28983 1726883031.48824: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883031.48825: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883031.48829: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883031.48954: done processing included file 28983 1726883031.48956: iterating over new_blocks loaded from include file 28983 1726883031.48958: in VariableManager get_vars() 28983 1726883031.48976: done with get_vars() 28983 1726883031.48978: filtering new block on tags 28983 1726883031.49007: done filtering new block on tags 28983 1726883031.49010: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 28983 1726883031.49014: extending task lists for all hosts with included blocks 28983 1726883031.49887: done extending task lists 28983 1726883031.49889: done processing 
included files 28983 1726883031.49890: results queue empty 28983 1726883031.49891: checking for any_errors_fatal 28983 1726883031.49895: done checking for any_errors_fatal 28983 1726883031.49896: checking for max_fail_percentage 28983 1726883031.49898: done checking for max_fail_percentage 28983 1726883031.49899: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.49900: done checking to see if all hosts have failed 28983 1726883031.49901: getting the remaining hosts for this loop 28983 1726883031.49902: done getting the remaining hosts for this loop 28983 1726883031.49906: getting the next task for host managed_node2 28983 1726883031.49911: done getting next task for host managed_node2 28983 1726883031.49914: ^ task is: TASK: Include network role 28983 1726883031.49917: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883031.49921: getting variables 28983 1726883031.49922: in VariableManager get_vars() 28983 1726883031.49935: Calling all_inventory to load vars for managed_node2 28983 1726883031.49945: Calling groups_inventory to load vars for managed_node2 28983 1726883031.49948: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.49954: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.49957: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.49961: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.52010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.54960: done with get_vars() 28983 1726883031.54992: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:43:51 -0400 (0:00:00.148) 0:01:01.548 ****** 28983 1726883031.55084: entering _queue_task() for managed_node2/include_role 28983 1726883031.55446: worker is 1 (out of 1 available) 28983 1726883031.55460: exiting _queue_task() for managed_node2/include_role 28983 1726883031.55475: done queuing things up, now waiting for results queue to drain 28983 1726883031.55477: waiting for pending results... 
28983 1726883031.55752: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883031.55923: in run() - task 0affe814-3a2d-b16d-c0a7-00000000108f 28983 1726883031.56031: variable 'ansible_search_path' from source: unknown 28983 1726883031.56036: variable 'ansible_search_path' from source: unknown 28983 1726883031.56040: calling self._execute() 28983 1726883031.56110: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.56124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.56143: variable 'omit' from source: magic vars 28983 1726883031.56625: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.56647: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.56658: _execute() done 28983 1726883031.56666: dumping result to json 28983 1726883031.56677: done dumping result, returning 28983 1726883031.56688: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-00000000108f] 28983 1726883031.56840: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000108f 28983 1726883031.56932: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000108f 28983 1726883031.56938: WORKER PROCESS EXITING 28983 1726883031.56970: no more pending results, returning what we have 28983 1726883031.56976: in VariableManager get_vars() 28983 1726883031.57017: Calling all_inventory to load vars for managed_node2 28983 1726883031.57021: Calling groups_inventory to load vars for managed_node2 28983 1726883031.57025: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.57040: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.57044: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.57048: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.59511: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.64471: done with get_vars() 28983 1726883031.64513: variable 'ansible_search_path' from source: unknown 28983 1726883031.64515: variable 'ansible_search_path' from source: unknown 28983 1726883031.64795: variable 'omit' from source: magic vars 28983 1726883031.64857: variable 'omit' from source: magic vars 28983 1726883031.64884: variable 'omit' from source: magic vars 28983 1726883031.64889: we have included files to process 28983 1726883031.64890: generating all_blocks data 28983 1726883031.64893: done generating all_blocks data 28983 1726883031.64894: processing included file: fedora.linux_system_roles.network 28983 1726883031.64922: in VariableManager get_vars() 28983 1726883031.64942: done with get_vars() 28983 1726883031.64983: in VariableManager get_vars() 28983 1726883031.65005: done with get_vars() 28983 1726883031.65061: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883031.65246: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883031.65370: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883031.66064: in VariableManager get_vars() 28983 1726883031.66094: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883031.69066: iterating over new_blocks loaded from include file 28983 1726883031.69069: in VariableManager get_vars() 28983 1726883031.69093: done with get_vars() 28983 1726883031.69095: filtering new block on tags 28983 1726883031.69541: done filtering new block on tags 28983 1726883031.69544: in VariableManager get_vars() 28983 1726883031.69561: done with get_vars() 28983 1726883031.69563: filtering new block on tags 28983 1726883031.69585: done 
filtering new block on tags 28983 1726883031.69587: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883031.69593: extending task lists for all hosts with included blocks 28983 1726883031.69870: done extending task lists 28983 1726883031.69872: done processing included files 28983 1726883031.69876: results queue empty 28983 1726883031.69877: checking for any_errors_fatal 28983 1726883031.69881: done checking for any_errors_fatal 28983 1726883031.69882: checking for max_fail_percentage 28983 1726883031.69884: done checking for max_fail_percentage 28983 1726883031.69885: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.69886: done checking to see if all hosts have failed 28983 1726883031.69887: getting the remaining hosts for this loop 28983 1726883031.69888: done getting the remaining hosts for this loop 28983 1726883031.69892: getting the next task for host managed_node2 28983 1726883031.69898: done getting next task for host managed_node2 28983 1726883031.69901: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883031.69906: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883031.69918: getting variables 28983 1726883031.69919: in VariableManager get_vars() 28983 1726883031.69936: Calling all_inventory to load vars for managed_node2 28983 1726883031.69940: Calling groups_inventory to load vars for managed_node2 28983 1726883031.69942: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.69949: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.69952: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.69955: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.72333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.75721: done with get_vars() 28983 1726883031.75756: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:43:51 -0400 (0:00:00.207) 0:01:01.756 ****** 28983 1726883031.75851: entering _queue_task() for managed_node2/include_tasks 28983 1726883031.76232: worker is 1 (out of 1 available) 28983 1726883031.76248: exiting _queue_task() for managed_node2/include_tasks 28983 1726883031.76263: done queuing things up, now waiting for results queue to drain 28983 1726883031.76265: waiting for pending results... 
28983 1726883031.76657: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883031.76667: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010f5 28983 1726883031.76685: variable 'ansible_search_path' from source: unknown 28983 1726883031.76689: variable 'ansible_search_path' from source: unknown 28983 1726883031.76725: calling self._execute() 28983 1726883031.76829: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.76841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.76859: variable 'omit' from source: magic vars 28983 1726883031.77279: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.77298: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.77337: _execute() done 28983 1726883031.77346: dumping result to json 28983 1726883031.77349: done dumping result, returning 28983 1726883031.77352: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-0000000010f5] 28983 1726883031.77355: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f5 28983 1726883031.77555: no more pending results, returning what we have 28983 1726883031.77560: in VariableManager get_vars() 28983 1726883031.77603: Calling all_inventory to load vars for managed_node2 28983 1726883031.77607: Calling groups_inventory to load vars for managed_node2 28983 1726883031.77610: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.77631: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.77637: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.77643: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.78167: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f5 28983 
1726883031.78170: WORKER PROCESS EXITING 28983 1726883031.79924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.83368: done with get_vars() 28983 1726883031.83400: variable 'ansible_search_path' from source: unknown 28983 1726883031.83401: variable 'ansible_search_path' from source: unknown 28983 1726883031.83452: we have included files to process 28983 1726883031.83454: generating all_blocks data 28983 1726883031.83456: done generating all_blocks data 28983 1726883031.83460: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883031.83461: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883031.83464: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883031.84202: done processing included file 28983 1726883031.84205: iterating over new_blocks loaded from include file 28983 1726883031.84206: in VariableManager get_vars() 28983 1726883031.84239: done with get_vars() 28983 1726883031.84242: filtering new block on tags 28983 1726883031.84281: done filtering new block on tags 28983 1726883031.84285: in VariableManager get_vars() 28983 1726883031.84312: done with get_vars() 28983 1726883031.84314: filtering new block on tags 28983 1726883031.84375: done filtering new block on tags 28983 1726883031.84379: in VariableManager get_vars() 28983 1726883031.84408: done with get_vars() 28983 1726883031.84410: filtering new block on tags 28983 1726883031.84471: done filtering new block on tags 28983 1726883031.84474: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883031.84479: extending task lists for all hosts 
with included blocks 28983 1726883031.86802: done extending task lists 28983 1726883031.86803: done processing included files 28983 1726883031.86804: results queue empty 28983 1726883031.86805: checking for any_errors_fatal 28983 1726883031.86809: done checking for any_errors_fatal 28983 1726883031.86810: checking for max_fail_percentage 28983 1726883031.86812: done checking for max_fail_percentage 28983 1726883031.86813: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.86814: done checking to see if all hosts have failed 28983 1726883031.86815: getting the remaining hosts for this loop 28983 1726883031.86817: done getting the remaining hosts for this loop 28983 1726883031.86820: getting the next task for host managed_node2 28983 1726883031.86826: done getting next task for host managed_node2 28983 1726883031.86829: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883031.86836: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883031.86848: getting variables 28983 1726883031.86849: in VariableManager get_vars() 28983 1726883031.86864: Calling all_inventory to load vars for managed_node2 28983 1726883031.86867: Calling groups_inventory to load vars for managed_node2 28983 1726883031.86870: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.86875: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.86879: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.86883: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.88859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883031.91741: done with get_vars() 28983 1726883031.91783: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:43:51 -0400 (0:00:00.160) 0:01:01.916 ****** 28983 1726883031.91881: entering _queue_task() for managed_node2/setup 28983 1726883031.92267: worker is 1 (out of 1 available) 28983 1726883031.92280: exiting _queue_task() for managed_node2/setup 28983 1726883031.92295: done queuing things up, now waiting for results queue to drain 28983 1726883031.92297: waiting for pending results... 
28983 1726883031.92596: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883031.92775: in run() - task 0affe814-3a2d-b16d-c0a7-000000001152 28983 1726883031.92795: variable 'ansible_search_path' from source: unknown 28983 1726883031.92800: variable 'ansible_search_path' from source: unknown 28983 1726883031.92837: calling self._execute() 28983 1726883031.92952: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883031.92959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883031.92972: variable 'omit' from source: magic vars 28983 1726883031.93425: variable 'ansible_distribution_major_version' from source: facts 28983 1726883031.93451: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883031.93728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883031.96450: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883031.96635: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883031.96640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883031.96699: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883031.96740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883031.96868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883031.96875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883031.96912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883031.96967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883031.96985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883031.97055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883031.97083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883031.97113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883031.97181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883031.97340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883031.97417: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883031.97429: variable 'ansible_facts' from source: unknown 28983 1726883031.98885: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883031.98890: when evaluation is False, skipping this task 28983 1726883031.98900: _execute() done 28983 1726883031.98905: dumping result to json 28983 1726883031.98911: done dumping result, returning 28983 1726883031.98920: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000001152] 28983 1726883031.98927: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001152 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883031.99157: no more pending results, returning what we have 28983 1726883031.99165: results queue empty 28983 1726883031.99167: checking for any_errors_fatal 28983 1726883031.99171: done checking for any_errors_fatal 28983 1726883031.99172: checking for max_fail_percentage 28983 1726883031.99176: done checking for max_fail_percentage 28983 1726883031.99177: checking to see if all hosts have failed and the running result is not ok 28983 1726883031.99179: done checking to see if all hosts have failed 28983 1726883031.99180: getting the remaining hosts for this loop 28983 1726883031.99182: done getting the remaining hosts for this loop 28983 1726883031.99192: getting the next task for host managed_node2 28983 1726883031.99210: done getting next task for host managed_node2 28983 1726883031.99218: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883031.99232: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883031.99267: getting variables 28983 1726883031.99269: in VariableManager get_vars() 28983 1726883031.99329: Calling all_inventory to load vars for managed_node2 28983 1726883031.99444: Calling groups_inventory to load vars for managed_node2 28983 1726883031.99450: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883031.99462: Calling all_plugins_play to load vars for managed_node2 28983 1726883031.99467: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883031.99472: Calling groups_plugins_play to load vars for managed_node2 28983 1726883031.99997: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001152 28983 1726883032.00007: WORKER PROCESS EXITING 28983 1726883032.04174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883032.08802: done with get_vars() 28983 1726883032.08846: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:43:52 -0400 (0:00:00.173) 0:01:02.089 ****** 28983 1726883032.09189: entering _queue_task() for managed_node2/stat 28983 1726883032.09662: worker is 1 (out of 1 available) 28983 1726883032.09686: exiting _queue_task() for managed_node2/stat 28983 1726883032.09700: done queuing things up, now waiting for results queue to drain 28983 1726883032.09702: waiting for pending results... 
28983 1726883032.10029: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883032.10222: in run() - task 0affe814-3a2d-b16d-c0a7-000000001154 28983 1726883032.10242: variable 'ansible_search_path' from source: unknown 28983 1726883032.10246: variable 'ansible_search_path' from source: unknown 28983 1726883032.10294: calling self._execute() 28983 1726883032.10428: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883032.10440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883032.10460: variable 'omit' from source: magic vars 28983 1726883032.10907: variable 'ansible_distribution_major_version' from source: facts 28983 1726883032.10918: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883032.11068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883032.11388: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883032.11480: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883032.11542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883032.11740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883032.11744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883032.11747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883032.11778: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883032.11816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883032.11937: variable '__network_is_ostree' from source: set_fact 28983 1726883032.11951: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883032.11959: when evaluation is False, skipping this task 28983 1726883032.11976: _execute() done 28983 1726883032.11987: dumping result to json 28983 1726883032.11996: done dumping result, returning 28983 1726883032.12009: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000001154] 28983 1726883032.12023: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001154 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883032.12350: no more pending results, returning what we have 28983 1726883032.12354: results queue empty 28983 1726883032.12355: checking for any_errors_fatal 28983 1726883032.12362: done checking for any_errors_fatal 28983 1726883032.12363: checking for max_fail_percentage 28983 1726883032.12365: done checking for max_fail_percentage 28983 1726883032.12366: checking to see if all hosts have failed and the running result is not ok 28983 1726883032.12367: done checking to see if all hosts have failed 28983 1726883032.12368: getting the remaining hosts for this loop 28983 1726883032.12370: done getting the remaining hosts for this loop 28983 1726883032.12375: getting the next task for host managed_node2 28983 1726883032.12384: done getting next task for host managed_node2 28983 
1726883032.12388: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883032.12395: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883032.12426: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001154 28983 1726883032.12429: WORKER PROCESS EXITING 28983 1726883032.12446: getting variables 28983 1726883032.12458: in VariableManager get_vars() 28983 1726883032.12546: Calling all_inventory to load vars for managed_node2 28983 1726883032.12550: Calling groups_inventory to load vars for managed_node2 28983 1726883032.12559: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883032.12569: Calling all_plugins_play to load vars for managed_node2 28983 1726883032.12572: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883032.12575: Calling groups_plugins_play to load vars for managed_node2 28983 1726883032.14380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883032.17374: done with get_vars() 28983 1726883032.17399: done getting variables 28983 1726883032.17459: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:43:52 -0400 (0:00:00.083) 0:01:02.172 ****** 28983 1726883032.17505: entering _queue_task() for managed_node2/set_fact 28983 1726883032.17869: worker is 1 (out of 1 available) 28983 1726883032.17884: exiting _queue_task() for managed_node2/set_fact 28983 1726883032.17898: done queuing things up, now waiting for results queue to drain 28983 1726883032.17900: waiting for pending results... 
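The two skipped tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree", both from `set_facts.yml`) show the role's guard pattern: the ostree probe only runs when `__network_is_ostree` is not already set, so on a second pass through the role both tasks skip with `false_condition: "not __network_is_ostree is defined"`. A minimal sketch of that pattern, reconstructed from the log (the marker path and register name are assumptions; the actual fedora.linux_system_roles.network source may differ):

```yaml
# Hedged sketch of the guard pattern visible in the log above.
# /run/ostree-booted and __ostree_booted_stat are assumptions,
# not confirmed by this log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Because `__network_is_ostree` already exists as a fact in this run, the `when` clause evaluates to False for both tasks and each reports `skip_reason: "Conditional result was False"`, exactly as the JSON results above show.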
28983 1726883032.18153: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883032.18441: in run() - task 0affe814-3a2d-b16d-c0a7-000000001155 28983 1726883032.18445: variable 'ansible_search_path' from source: unknown 28983 1726883032.18448: variable 'ansible_search_path' from source: unknown 28983 1726883032.18452: calling self._execute() 28983 1726883032.18527: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883032.18548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883032.18567: variable 'omit' from source: magic vars 28983 1726883032.19011: variable 'ansible_distribution_major_version' from source: facts 28983 1726883032.19021: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883032.19305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883032.19692: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883032.19757: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883032.19829: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883032.19898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883032.20201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883032.20264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883032.20307: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883032.20384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883032.20520: variable '__network_is_ostree' from source: set_fact 28983 1726883032.20533: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883032.20556: when evaluation is False, skipping this task 28983 1726883032.20569: _execute() done 28983 1726883032.20581: dumping result to json 28983 1726883032.20590: done dumping result, returning 28983 1726883032.20653: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000001155] 28983 1726883032.20661: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001155 28983 1726883032.20741: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001155 28983 1726883032.20745: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883032.20827: no more pending results, returning what we have 28983 1726883032.20831: results queue empty 28983 1726883032.20833: checking for any_errors_fatal 28983 1726883032.20845: done checking for any_errors_fatal 28983 1726883032.20846: checking for max_fail_percentage 28983 1726883032.20848: done checking for max_fail_percentage 28983 1726883032.20849: checking to see if all hosts have failed and the running result is not ok 28983 1726883032.20850: done checking to see if all hosts have failed 28983 1726883032.20851: getting the remaining hosts for this loop 28983 1726883032.20854: done getting the remaining hosts for this loop 
28983 1726883032.20860: getting the next task for host managed_node2 28983 1726883032.20883: done getting next task for host managed_node2 28983 1726883032.20888: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883032.20896: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883032.20921: getting variables 28983 1726883032.20924: in VariableManager get_vars() 28983 1726883032.21075: Calling all_inventory to load vars for managed_node2 28983 1726883032.21079: Calling groups_inventory to load vars for managed_node2 28983 1726883032.21099: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883032.21112: Calling all_plugins_play to load vars for managed_node2 28983 1726883032.21117: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883032.21120: Calling groups_plugins_play to load vars for managed_node2 28983 1726883032.23654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883032.25775: done with get_vars() 28983 1726883032.25801: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:43:52 -0400 (0:00:00.083) 0:01:02.256 ****** 28983 1726883032.25888: entering _queue_task() for managed_node2/service_facts 28983 1726883032.26164: worker is 1 (out of 1 available) 28983 1726883032.26181: exiting _queue_task() for managed_node2/service_facts 28983 1726883032.26196: done queuing things up, now waiting for results queue to drain 28983 1726883032.26198: waiting for pending results... 
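The task queued here ("Check which services are running", `set_facts.yml:21`) dispatches the `service_facts` action for `managed_node2`; the long ANSIBALLZ/SSH exchange that follows is the module being packaged, copied to a remote temp directory, and executed. A minimal standalone equivalent of the task itself (a sketch; the real role task may carry additional arguments or tags):

```yaml
# Hedged sketch: the bare service_facts invocation this log is executing.
- name: Check which services are running
  service_facts:
# On success, ansible_facts.services is populated with entries like
#   "chronyd.service": {"name": "chronyd.service", "state": "running",
#                       "status": "enabled", "source": "systemd"}
# as seen in the module's stdout later in this log.
```

The module returns no changes (`changed: false` semantics); it only gathers the service inventory that later conditionals in the role consult.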
28983 1726883032.26388: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883032.26506: in run() - task 0affe814-3a2d-b16d-c0a7-000000001157 28983 1726883032.26521: variable 'ansible_search_path' from source: unknown 28983 1726883032.26524: variable 'ansible_search_path' from source: unknown 28983 1726883032.26562: calling self._execute() 28983 1726883032.26651: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883032.26655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883032.26667: variable 'omit' from source: magic vars 28983 1726883032.26990: variable 'ansible_distribution_major_version' from source: facts 28983 1726883032.27003: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883032.27009: variable 'omit' from source: magic vars 28983 1726883032.27077: variable 'omit' from source: magic vars 28983 1726883032.27105: variable 'omit' from source: magic vars 28983 1726883032.27143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883032.27178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883032.27195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883032.27214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883032.27224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883032.27252: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883032.27257: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883032.27263: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883032.27345: Set connection var ansible_connection to ssh 28983 1726883032.27356: Set connection var ansible_shell_executable to /bin/sh 28983 1726883032.27365: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883032.27375: Set connection var ansible_timeout to 10 28983 1726883032.27381: Set connection var ansible_pipelining to False 28983 1726883032.27384: Set connection var ansible_shell_type to sh 28983 1726883032.27405: variable 'ansible_shell_executable' from source: unknown 28983 1726883032.27409: variable 'ansible_connection' from source: unknown 28983 1726883032.27412: variable 'ansible_module_compression' from source: unknown 28983 1726883032.27415: variable 'ansible_shell_type' from source: unknown 28983 1726883032.27419: variable 'ansible_shell_executable' from source: unknown 28983 1726883032.27421: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883032.27431: variable 'ansible_pipelining' from source: unknown 28983 1726883032.27436: variable 'ansible_timeout' from source: unknown 28983 1726883032.27439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883032.27599: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883032.27609: variable 'omit' from source: magic vars 28983 1726883032.27615: starting attempt loop 28983 1726883032.27618: running the handler 28983 1726883032.27632: _low_level_execute_command(): starting 28983 1726883032.27642: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883032.28278: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.28298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883032.28313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883032.28441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883032.30226: stdout chunk (state=3): >>>/root <<< 28983 1726883032.30410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883032.30426: stderr chunk (state=3): >>><<< 28983 1726883032.30430: stdout chunk (state=3): >>><<< 28983 1726883032.30469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883032.30503: _low_level_execute_command(): starting 28983 1726883032.30507: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591 `" && echo ansible-tmp-1726883032.3046103-31192-97883173394591="` echo /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591 `" ) && sleep 0' 28983 1726883032.31278: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883032.31290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883032.31293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883032.31387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883032.33409: stdout chunk (state=3): >>>ansible-tmp-1726883032.3046103-31192-97883173394591=/root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591 <<< 28983 1726883032.33527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883032.33572: stderr chunk (state=3): >>><<< 28983 1726883032.33575: stdout chunk (state=3): >>><<< 28983 1726883032.33591: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883032.3046103-31192-97883173394591=/root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883032.33631: variable 'ansible_module_compression' from source: unknown 28983 1726883032.33673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883032.33707: variable 'ansible_facts' from source: unknown 28983 1726883032.33771: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py 28983 1726883032.33887: Sending initial data 28983 1726883032.33890: Sent initial data (161 bytes) 28983 1726883032.34323: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883032.34359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883032.34363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.34366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.34419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883032.34422: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883032.34499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883032.36135: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883032.36148: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883032.36229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883032.36320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqzuu9thr /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py <<< 28983 1726883032.36323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py" <<< 28983 1726883032.36403: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqzuu9thr" to remote "/root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py" <<< 28983 1726883032.37739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883032.37775: stderr chunk (state=3): >>><<< 28983 1726883032.37778: stdout chunk (state=3): >>><<< 28983 1726883032.37796: done transferring module to remote 28983 1726883032.37806: _low_level_execute_command(): starting 28983 1726883032.37811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/ /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py && sleep 0' 28983 1726883032.38374: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883032.38379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.38382: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883032.38384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.38458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883032.38486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883032.38584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883032.40470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883032.40518: stderr chunk (state=3): >>><<< 28983 1726883032.40525: stdout chunk (state=3): >>><<< 28983 1726883032.40537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883032.40541: _low_level_execute_command(): starting 28983 1726883032.40547: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/AnsiballZ_service_facts.py && sleep 0' 28983 1726883032.41019: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883032.41054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.41058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883032.41060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883032.41139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883032.41144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 28983 1726883032.41237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883034.46677: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": 
{"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": 
"ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service<<< 28983 1726883034.46735: stdout chunk (state=3): >>>", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "sourc<<< 28983 1726883034.46748: stdout chunk (state=3): >>>e": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "stati<<< 28983 1726883034.46753: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "sys<<< 28983 1726883034.46775: stdout chunk (state=3): >>>temd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883034.48430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883034.48497: stderr chunk (state=3): >>><<< 28983 1726883034.48501: stdout chunk (state=3): >>><<< 28983 1726883034.48532: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883034.49232: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883034.49244: _low_level_execute_command(): starting 28983 1726883034.49250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883032.3046103-31192-97883173394591/ > /dev/null 2>&1 && sleep 0' 28983 1726883034.49751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883034.49755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883034.49758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
28983 1726883034.49760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883034.49762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883034.49816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883034.49824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883034.49898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883034.51919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883034.51967: stderr chunk (state=3): >>><<< 28983 1726883034.51971: stdout chunk (state=3): >>><<< 28983 1726883034.51990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883034.51997: handler run complete 28983 1726883034.52168: variable 'ansible_facts' from source: unknown 28983 1726883034.52315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883034.52772: variable 'ansible_facts' from source: unknown 28983 1726883034.52903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883034.53098: attempt loop complete, returning result 28983 1726883034.53108: _execute() done 28983 1726883034.53111: dumping result to json 28983 1726883034.53159: done dumping result, returning 28983 1726883034.53170: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000001157] 28983 1726883034.53178: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001157 28983 1726883034.54177: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001157 28983 1726883034.54180: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883034.54244: no more pending results, returning what we have 28983 1726883034.54247: results queue empty 28983 1726883034.54248: checking for any_errors_fatal 28983 1726883034.54251: done checking for any_errors_fatal 28983 1726883034.54251: checking for max_fail_percentage 28983 1726883034.54253: done checking for max_fail_percentage 28983 1726883034.54253: checking to see 
if all hosts have failed and the running result is not ok 28983 1726883034.54254: done checking to see if all hosts have failed 28983 1726883034.54255: getting the remaining hosts for this loop 28983 1726883034.54256: done getting the remaining hosts for this loop 28983 1726883034.54259: getting the next task for host managed_node2 28983 1726883034.54266: done getting next task for host managed_node2 28983 1726883034.54270: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883034.54277: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883034.54287: getting variables 28983 1726883034.54288: in VariableManager get_vars() 28983 1726883034.54313: Calling all_inventory to load vars for managed_node2 28983 1726883034.54315: Calling groups_inventory to load vars for managed_node2 28983 1726883034.54316: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883034.54323: Calling all_plugins_play to load vars for managed_node2 28983 1726883034.54325: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883034.54327: Calling groups_plugins_play to load vars for managed_node2 28983 1726883034.55528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883034.57149: done with get_vars() 28983 1726883034.57174: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:43:54 -0400 (0:00:02.313) 0:01:04.570 ****** 28983 1726883034.57258: entering _queue_task() for managed_node2/package_facts 28983 1726883034.57502: worker is 1 (out of 1 available) 28983 1726883034.57516: exiting _queue_task() for managed_node2/package_facts 28983 1726883034.57531: done queuing things up, now waiting for results queue to drain 28983 1726883034.57533: waiting for pending results... 
28983 1726883034.57873: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883034.58053: in run() - task 0affe814-3a2d-b16d-c0a7-000000001158 28983 1726883034.58068: variable 'ansible_search_path' from source: unknown 28983 1726883034.58072: variable 'ansible_search_path' from source: unknown 28983 1726883034.58106: calling self._execute() 28983 1726883034.58193: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883034.58197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883034.58208: variable 'omit' from source: magic vars 28983 1726883034.58535: variable 'ansible_distribution_major_version' from source: facts 28983 1726883034.58546: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883034.58555: variable 'omit' from source: magic vars 28983 1726883034.58623: variable 'omit' from source: magic vars 28983 1726883034.58651: variable 'omit' from source: magic vars 28983 1726883034.58690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883034.58721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883034.58741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883034.58768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883034.58782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883034.58810: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883034.58813: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883034.58818: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883034.58902: Set connection var ansible_connection to ssh 28983 1726883034.58912: Set connection var ansible_shell_executable to /bin/sh 28983 1726883034.58921: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883034.58930: Set connection var ansible_timeout to 10 28983 1726883034.58939: Set connection var ansible_pipelining to False 28983 1726883034.58941: Set connection var ansible_shell_type to sh 28983 1726883034.58964: variable 'ansible_shell_executable' from source: unknown 28983 1726883034.58967: variable 'ansible_connection' from source: unknown 28983 1726883034.58970: variable 'ansible_module_compression' from source: unknown 28983 1726883034.58974: variable 'ansible_shell_type' from source: unknown 28983 1726883034.58978: variable 'ansible_shell_executable' from source: unknown 28983 1726883034.58980: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883034.58985: variable 'ansible_pipelining' from source: unknown 28983 1726883034.58988: variable 'ansible_timeout' from source: unknown 28983 1726883034.58998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883034.59161: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883034.59172: variable 'omit' from source: magic vars 28983 1726883034.59177: starting attempt loop 28983 1726883034.59180: running the handler 28983 1726883034.59194: _low_level_execute_command(): starting 28983 1726883034.59202: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883034.59751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883034.59756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883034.59760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883034.59817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883034.59820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883034.59930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883034.61697: stdout chunk (state=3): >>>/root <<< 28983 1726883034.61859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883034.61897: stderr chunk (state=3): >>><<< 28983 1726883034.61910: stdout chunk (state=3): >>><<< 28983 1726883034.61943: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883034.62059: _low_level_execute_command(): starting 28983 1726883034.62064: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005 `" && echo ansible-tmp-1726883034.6195168-31250-75072351998005="` echo /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005 `" ) && sleep 0' 28983 1726883034.62739: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883034.62797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883034.62814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883034.62833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883034.62945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883034.65018: stdout chunk (state=3): >>>ansible-tmp-1726883034.6195168-31250-75072351998005=/root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005 <<< 28983 1726883034.65139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883034.65183: stderr chunk (state=3): >>><<< 28983 1726883034.65187: stdout chunk (state=3): >>><<< 28983 1726883034.65199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883034.6195168-31250-75072351998005=/root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883034.65237: variable 'ansible_module_compression' from source: unknown 28983 1726883034.65283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883034.65333: variable 'ansible_facts' from source: unknown 28983 1726883034.65486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py 28983 1726883034.65651: Sending initial data 28983 1726883034.65663: Sent initial data (161 bytes) 28983 1726883034.66395: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883034.66399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 
1726883034.66466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883034.68160: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726883034.68191: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883034.68268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883034.68353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpvloqth5s /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py <<< 28983 1726883034.68357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py" <<< 28983 1726883034.68407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpvloqth5s" to remote "/root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py" <<< 28983 1726883034.71223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883034.71227: stderr chunk (state=3): >>><<< 28983 1726883034.71229: stdout chunk (state=3): >>><<< 28983 1726883034.71231: done transferring module to remote 28983 1726883034.71237: _low_level_execute_command(): starting 28983 1726883034.71239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/ /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py && sleep 0' 28983 1726883034.71818: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883034.71833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883034.71848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883034.71948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883034.73910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883034.73999: stderr chunk (state=3): >>><<< 28983 1726883034.74240: stdout chunk (state=3): >>><<< 28983 1726883034.74266: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883034.74277: _low_level_execute_command(): starting 28983 1726883034.74280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/AnsiballZ_package_facts.py && sleep 0' 28983 1726883034.74904: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883034.74917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883034.74950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883034.74962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883034.75021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883034.75069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883034.75083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883034.75100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883034.75214: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883035.39136: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": 
"filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": 
"intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": 
"vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": 
"bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 28983 1726883035.39165: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": 
"9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": 
[{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": 
"54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", 
"version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", 
"version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": 
[{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": 
[{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": 
[{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726883035.39293: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": 
[{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": 
[{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "sou<<< 28983 1726883035.39319: stdout chunk (state=3): >>>rce": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "a<<< 28983 1726883035.39331: stdout chunk (state=3): >>>spell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 28983 1726883035.39354: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": 
"python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 28983 1726883035.39375: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": 
"0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 28983 1726883035.39396: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883035.41196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883035.41262: stderr chunk (state=3): >>><<< 28983 1726883035.41265: stdout chunk (state=3): >>><<< 28983 1726883035.41313: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": 
[{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": 
"2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", 
"version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", 
"version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": 
"plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": 
[{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", 
"version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", 
"version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", 
"release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": 
"100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", 
"version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", 
"version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", 
"version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": 
[{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": 
[{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", 
"release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": 
"3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883035.43602: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883035.43624: _low_level_execute_command(): starting 28983 1726883035.43627: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883034.6195168-31250-75072351998005/ > /dev/null 2>&1 && sleep 0' 28983 1726883035.44114: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883035.44117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883035.44120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883035.44122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883035.44125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883035.44186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883035.44189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883035.44256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883035.46215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883035.46266: stderr chunk (state=3): >>><<< 28983 1726883035.46269: stdout chunk (state=3): >>><<< 28983 1726883035.46285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883035.46293: handler run complete 28983 1726883035.47100: variable 'ansible_facts' from source: unknown 28983 1726883035.47595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.49563: variable 'ansible_facts' from source: unknown 28983 1726883035.49991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.50761: attempt loop complete, returning result 28983 1726883035.50778: _execute() done 28983 1726883035.50781: dumping result to json 28983 1726883035.50961: done dumping result, returning 28983 1726883035.50976: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000001158] 28983 1726883035.50979: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001158 28983 1726883035.53144: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001158 28983 1726883035.53149: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883035.53249: no more pending results, returning what we have 28983 1726883035.53251: results queue empty 28983 1726883035.53252: checking for any_errors_fatal 28983 1726883035.53256: done checking for any_errors_fatal 28983 1726883035.53256: checking for max_fail_percentage 28983 1726883035.53257: done checking for max_fail_percentage 28983 1726883035.53258: checking to see if all hosts have failed and the running result 
is not ok 28983 1726883035.53259: done checking to see if all hosts have failed 28983 1726883035.53259: getting the remaining hosts for this loop 28983 1726883035.53260: done getting the remaining hosts for this loop 28983 1726883035.53264: getting the next task for host managed_node2 28983 1726883035.53270: done getting next task for host managed_node2 28983 1726883035.53272: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883035.53277: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883035.53287: getting variables 28983 1726883035.53288: in VariableManager get_vars() 28983 1726883035.53315: Calling all_inventory to load vars for managed_node2 28983 1726883035.53317: Calling groups_inventory to load vars for managed_node2 28983 1726883035.53319: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883035.53326: Calling all_plugins_play to load vars for managed_node2 28983 1726883035.53328: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883035.53330: Calling groups_plugins_play to load vars for managed_node2 28983 1726883035.54464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.56044: done with get_vars() 28983 1726883035.56067: done getting variables 28983 1726883035.56120: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:43:55 -0400 (0:00:00.988) 0:01:05.559 ****** 28983 1726883035.56155: entering _queue_task() for managed_node2/debug 28983 1726883035.56395: worker is 1 (out of 1 available) 28983 1726883035.56409: exiting _queue_task() for managed_node2/debug 28983 1726883035.56422: done queuing things up, now waiting for results queue to drain 28983 1726883035.56424: waiting for pending results... 
28983 1726883035.56625: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883035.56742: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010f6 28983 1726883035.56760: variable 'ansible_search_path' from source: unknown 28983 1726883035.56765: variable 'ansible_search_path' from source: unknown 28983 1726883035.56798: calling self._execute() 28983 1726883035.56887: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883035.56893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.56904: variable 'omit' from source: magic vars 28983 1726883035.57236: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.57247: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883035.57254: variable 'omit' from source: magic vars 28983 1726883035.57307: variable 'omit' from source: magic vars 28983 1726883035.57391: variable 'network_provider' from source: set_fact 28983 1726883035.57409: variable 'omit' from source: magic vars 28983 1726883035.57449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883035.57481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883035.57500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883035.57517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883035.57530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883035.57561: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883035.57565: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883035.57568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.57651: Set connection var ansible_connection to ssh 28983 1726883035.57662: Set connection var ansible_shell_executable to /bin/sh 28983 1726883035.57671: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883035.57682: Set connection var ansible_timeout to 10 28983 1726883035.57689: Set connection var ansible_pipelining to False 28983 1726883035.57692: Set connection var ansible_shell_type to sh 28983 1726883035.57711: variable 'ansible_shell_executable' from source: unknown 28983 1726883035.57715: variable 'ansible_connection' from source: unknown 28983 1726883035.57718: variable 'ansible_module_compression' from source: unknown 28983 1726883035.57721: variable 'ansible_shell_type' from source: unknown 28983 1726883035.57725: variable 'ansible_shell_executable' from source: unknown 28983 1726883035.57729: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883035.57736: variable 'ansible_pipelining' from source: unknown 28983 1726883035.57739: variable 'ansible_timeout' from source: unknown 28983 1726883035.57745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.57862: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883035.57872: variable 'omit' from source: magic vars 28983 1726883035.57880: starting attempt loop 28983 1726883035.57883: running the handler 28983 1726883035.57923: handler run complete 28983 1726883035.57937: attempt loop complete, returning result 28983 1726883035.57940: _execute() done 28983 1726883035.57944: dumping result to json 28983 1726883035.57949: done dumping result, returning 
28983 1726883035.57958: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-0000000010f6] 28983 1726883035.57963: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f6 28983 1726883035.58054: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f6 28983 1726883035.58057: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883035.58138: no more pending results, returning what we have 28983 1726883035.58142: results queue empty 28983 1726883035.58143: checking for any_errors_fatal 28983 1726883035.58149: done checking for any_errors_fatal 28983 1726883035.58150: checking for max_fail_percentage 28983 1726883035.58152: done checking for max_fail_percentage 28983 1726883035.58153: checking to see if all hosts have failed and the running result is not ok 28983 1726883035.58154: done checking to see if all hosts have failed 28983 1726883035.58155: getting the remaining hosts for this loop 28983 1726883035.58157: done getting the remaining hosts for this loop 28983 1726883035.58161: getting the next task for host managed_node2 28983 1726883035.58168: done getting next task for host managed_node2 28983 1726883035.58172: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883035.58178: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883035.58191: getting variables 28983 1726883035.58192: in VariableManager get_vars() 28983 1726883035.58227: Calling all_inventory to load vars for managed_node2 28983 1726883035.58230: Calling groups_inventory to load vars for managed_node2 28983 1726883035.58233: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883035.58247: Calling all_plugins_play to load vars for managed_node2 28983 1726883035.58250: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883035.58253: Calling groups_plugins_play to load vars for managed_node2 28983 1726883035.59498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.61067: done with get_vars() 28983 1726883035.61090: done getting variables 28983 1726883035.61136: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:43:55 -0400 (0:00:00.050) 0:01:05.609 ****** 28983 1726883035.61169: entering _queue_task() for managed_node2/fail 28983 1726883035.61378: worker is 1 (out of 1 available) 28983 1726883035.61393: exiting _queue_task() for managed_node2/fail 28983 1726883035.61407: done queuing things up, now waiting for results queue to drain 28983 1726883035.61409: waiting for pending results... 28983 1726883035.61599: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883035.61709: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010f7 28983 1726883035.61721: variable 'ansible_search_path' from source: unknown 28983 1726883035.61725: variable 'ansible_search_path' from source: unknown 28983 1726883035.61759: calling self._execute() 28983 1726883035.61842: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883035.61850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.61863: variable 'omit' from source: magic vars 28983 1726883035.62169: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.62184: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883035.62288: variable 'network_state' from source: role '' defaults 28983 1726883035.62299: Evaluated conditional (network_state != {}): False 28983 1726883035.62306: when evaluation is False, skipping this task 28983 1726883035.62309: _execute() done 28983 1726883035.62312: dumping result to json 28983 1726883035.62318: done dumping result, returning 28983 1726883035.62325: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-0000000010f7] 28983 1726883035.62331: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f7 28983 1726883035.62429: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f7 28983 1726883035.62433: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883035.62489: no more pending results, returning what we have 28983 1726883035.62493: results queue empty 28983 1726883035.62494: checking for any_errors_fatal 28983 1726883035.62498: done checking for any_errors_fatal 28983 1726883035.62499: checking for max_fail_percentage 28983 1726883035.62501: done checking for max_fail_percentage 28983 1726883035.62502: checking to see if all hosts have failed and the running result is not ok 28983 1726883035.62503: done checking to see if all hosts have failed 28983 1726883035.62503: getting the remaining hosts for this loop 28983 1726883035.62505: done getting the remaining hosts for this loop 28983 1726883035.62508: getting the next task for host managed_node2 28983 1726883035.62515: done getting next task for host managed_node2 28983 1726883035.62519: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883035.62524: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883035.62550: getting variables 28983 1726883035.62552: in VariableManager get_vars() 28983 1726883035.62590: Calling all_inventory to load vars for managed_node2 28983 1726883035.62593: Calling groups_inventory to load vars for managed_node2 28983 1726883035.62595: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883035.62604: Calling all_plugins_play to load vars for managed_node2 28983 1726883035.62607: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883035.62611: Calling groups_plugins_play to load vars for managed_node2 28983 1726883035.63892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.71100: done with get_vars() 28983 1726883035.71141: done getting variables 28983 1726883035.71196: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:43:55 -0400 (0:00:00.100) 0:01:05.710 ****** 28983 1726883035.71233: entering _queue_task() for managed_node2/fail 28983 1726883035.71591: worker is 1 (out of 1 available) 28983 1726883035.71605: exiting _queue_task() for managed_node2/fail 28983 1726883035.71619: done queuing things up, now waiting for results queue to drain 28983 1726883035.71622: waiting for pending results... 28983 1726883035.71893: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883035.72019: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010f8 28983 1726883035.72034: variable 'ansible_search_path' from source: unknown 28983 1726883035.72038: variable 'ansible_search_path' from source: unknown 28983 1726883035.72073: calling self._execute() 28983 1726883035.72163: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883035.72171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.72183: variable 'omit' from source: magic vars 28983 1726883035.72517: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.72528: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883035.72639: variable 'network_state' from source: role '' defaults 28983 1726883035.72649: Evaluated conditional (network_state != {}): False 28983 1726883035.72653: when evaluation is False, skipping this task 28983 1726883035.72656: _execute() done 28983 1726883035.72660: dumping result to json 28983 1726883035.72665: done dumping result, returning 28983 1726883035.72673: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-0000000010f8] 28983 1726883035.72681: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f8 28983 1726883035.72782: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f8 28983 1726883035.72785: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883035.72840: no more pending results, returning what we have 28983 1726883035.72844: results queue empty 28983 1726883035.72845: checking for any_errors_fatal 28983 1726883035.72856: done checking for any_errors_fatal 28983 1726883035.72857: checking for max_fail_percentage 28983 1726883035.72859: done checking for max_fail_percentage 28983 1726883035.72860: checking to see if all hosts have failed and the running result is not ok 28983 1726883035.72861: done checking to see if all hosts have failed 28983 1726883035.72862: getting the remaining hosts for this loop 28983 1726883035.72864: done getting the remaining hosts for this loop 28983 1726883035.72869: getting the next task for host managed_node2 28983 1726883035.72877: done getting next task for host managed_node2 28983 1726883035.72883: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883035.72888: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883035.72910: getting variables 28983 1726883035.72912: in VariableManager get_vars() 28983 1726883035.72953: Calling all_inventory to load vars for managed_node2 28983 1726883035.72956: Calling groups_inventory to load vars for managed_node2 28983 1726883035.72958: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883035.72967: Calling all_plugins_play to load vars for managed_node2 28983 1726883035.72970: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883035.72973: Calling groups_plugins_play to load vars for managed_node2 28983 1726883035.74652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.76977: done with get_vars() 28983 1726883035.76999: done getting variables 28983 1726883035.77048: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:43:55 -0400 (0:00:00.058) 0:01:05.768 ****** 28983 1726883035.77076: entering _queue_task() for managed_node2/fail 28983 1726883035.77301: worker is 1 (out of 1 available) 28983 1726883035.77315: exiting _queue_task() for managed_node2/fail 28983 1726883035.77328: done queuing things up, now waiting for results queue to drain 28983 1726883035.77330: waiting for pending results... 28983 1726883035.77520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883035.77636: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010f9 28983 1726883035.77657: variable 'ansible_search_path' from source: unknown 28983 1726883035.77662: variable 'ansible_search_path' from source: unknown 28983 1726883035.77698: calling self._execute() 28983 1726883035.77785: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883035.77793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.77805: variable 'omit' from source: magic vars 28983 1726883035.78153: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.78157: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883035.78440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883035.80697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883035.80754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883035.80796: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883035.80829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883035.80853: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883035.80919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883035.80950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883035.80969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883035.81003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883035.81016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883035.81096: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.81111: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883035.81207: variable 'ansible_distribution' from source: facts 28983 1726883035.81210: variable '__network_rh_distros' from source: role '' defaults 28983 1726883035.81220: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883035.81224: when evaluation is False, skipping this task 28983 
1726883035.81227: _execute() done 28983 1726883035.81230: dumping result to json 28983 1726883035.81237: done dumping result, returning 28983 1726883035.81244: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-0000000010f9] 28983 1726883035.81250: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f9 28983 1726883035.81355: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010f9 28983 1726883035.81358: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883035.81413: no more pending results, returning what we have 28983 1726883035.81416: results queue empty 28983 1726883035.81418: checking for any_errors_fatal 28983 1726883035.81424: done checking for any_errors_fatal 28983 1726883035.81425: checking for max_fail_percentage 28983 1726883035.81427: done checking for max_fail_percentage 28983 1726883035.81428: checking to see if all hosts have failed and the running result is not ok 28983 1726883035.81429: done checking to see if all hosts have failed 28983 1726883035.81430: getting the remaining hosts for this loop 28983 1726883035.81432: done getting the remaining hosts for this loop 28983 1726883035.81439: getting the next task for host managed_node2 28983 1726883035.81447: done getting next task for host managed_node2 28983 1726883035.81453: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883035.81459: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883035.81480: getting variables 28983 1726883035.81485: in VariableManager get_vars() 28983 1726883035.81523: Calling all_inventory to load vars for managed_node2 28983 1726883035.81526: Calling groups_inventory to load vars for managed_node2 28983 1726883035.81529: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883035.81543: Calling all_plugins_play to load vars for managed_node2 28983 1726883035.81546: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883035.81550: Calling groups_plugins_play to load vars for managed_node2 28983 1726883035.84014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883035.88689: done with get_vars() 28983 1726883035.88731: done getting variables 28983 1726883035.88809: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:43:55 -0400 (0:00:00.118) 0:01:05.887 ****** 28983 1726883035.88965: entering _queue_task() for managed_node2/dnf 28983 1726883035.89565: worker is 1 (out of 1 available) 28983 1726883035.89579: exiting _queue_task() for managed_node2/dnf 28983 1726883035.89590: done queuing things up, now waiting for results queue to drain 28983 1726883035.89592: waiting for pending results... 28983 1726883035.89725: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883035.89930: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010fa 28983 1726883035.89935: variable 'ansible_search_path' from source: unknown 28983 1726883035.89945: variable 'ansible_search_path' from source: unknown 28983 1726883035.90142: calling self._execute() 28983 1726883035.90180: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883035.90197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883035.90219: variable 'omit' from source: magic vars 28983 1726883035.90791: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.90816: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883035.91096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883035.94315: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883035.94418: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883035.94542: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883035.94546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883035.94576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883035.94681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883035.94726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883035.94775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883035.94837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883035.94866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883035.95022: variable 'ansible_distribution' from source: facts 28983 1726883035.95036: variable 'ansible_distribution_major_version' from source: facts 28983 1726883035.95055: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883035.95208: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883035.95410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883035.95449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883035.95494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883035.95553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883035.95598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883035.95644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883035.95707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883035.95726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883035.95786: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883035.95816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883035.95924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883035.95928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883035.95952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883035.96011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883035.96042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883035.96270: variable 'network_connections' from source: include params 28983 1726883035.96293: variable 'interface' from source: play vars 28983 1726883035.96382: variable 'interface' from source: play vars 28983 1726883035.96486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883035.96878: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883035.97163: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883035.97442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883035.97580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883035.97643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883035.97680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883035.97825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883035.97832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883035.97966: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883035.99058: variable 'network_connections' from source: include params 28983 1726883035.99062: variable 'interface' from source: play vars 28983 1726883035.99430: variable 'interface' from source: play vars 28983 1726883035.99506: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883035.99519: when evaluation is False, skipping this task 28983 1726883035.99532: _execute() done 28983 1726883035.99601: dumping result to json 28983 1726883035.99605: done dumping result, returning 28983 1726883035.99608: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000010fa] 28983 1726883035.99610: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fa skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883036.00106: no more pending results, returning what we have 28983 1726883036.00110: results queue empty 28983 1726883036.00111: checking for any_errors_fatal 28983 1726883036.00119: done checking for any_errors_fatal 28983 1726883036.00120: checking for max_fail_percentage 28983 1726883036.00122: done checking for max_fail_percentage 28983 1726883036.00123: checking to see if all hosts have failed and the running result is not ok 28983 1726883036.00124: done checking to see if all hosts have failed 28983 1726883036.00125: getting the remaining hosts for this loop 28983 1726883036.00127: done getting the remaining hosts for this loop 28983 1726883036.00132: getting the next task for host managed_node2 28983 1726883036.00149: done getting next task for host managed_node2 28983 1726883036.00155: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883036.00161: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883036.00189: getting variables 28983 1726883036.00191: in VariableManager get_vars() 28983 1726883036.00314: Calling all_inventory to load vars for managed_node2 28983 1726883036.00318: Calling groups_inventory to load vars for managed_node2 28983 1726883036.00322: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883036.00367: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fa 28983 1726883036.00370: WORKER PROCESS EXITING 28983 1726883036.00383: Calling all_plugins_play to load vars for managed_node2 28983 1726883036.00387: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883036.00395: Calling groups_plugins_play to load vars for managed_node2 28983 1726883036.03942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883036.09113: done with get_vars() 28983 1726883036.09154: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883036.09248: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:43:56 -0400 (0:00:00.203) 0:01:06.090 ****** 28983 1726883036.09293: entering _queue_task() for managed_node2/yum 28983 1726883036.09812: worker is 1 (out of 1 available) 28983 1726883036.09828: exiting _queue_task() for managed_node2/yum 28983 1726883036.09844: done queuing things up, now waiting for results queue to drain 28983 1726883036.09846: waiting for pending results... 28983 1726883036.10225: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883036.10425: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010fb 28983 1726883036.10455: variable 'ansible_search_path' from source: unknown 28983 1726883036.10465: variable 'ansible_search_path' from source: unknown 28983 1726883036.10514: calling self._execute() 28983 1726883036.10640: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883036.10655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883036.10681: variable 'omit' from source: magic vars 28983 1726883036.11164: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.11188: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883036.11479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883036.15543: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883036.15546: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883036.15683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883036.15725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883036.15762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883036.16111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.16147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.16181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.16539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.16543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.16600: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.16624: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28983 1726883036.16628: when evaluation is False, skipping this task 
28983 1726883036.16631: _execute() done 28983 1726883036.16760: dumping result to json 28983 1726883036.16768: done dumping result, returning 28983 1726883036.16772: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000010fb] 28983 1726883036.16784: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fb 28983 1726883036.16902: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fb 28983 1726883036.16906: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883036.16968: no more pending results, returning what we have 28983 1726883036.16975: results queue empty 28983 1726883036.16976: checking for any_errors_fatal 28983 1726883036.16985: done checking for any_errors_fatal 28983 1726883036.16986: checking for max_fail_percentage 28983 1726883036.16988: done checking for max_fail_percentage 28983 1726883036.16989: checking to see if all hosts have failed and the running result is not ok 28983 1726883036.16990: done checking to see if all hosts have failed 28983 1726883036.16991: getting the remaining hosts for this loop 28983 1726883036.16993: done getting the remaining hosts for this loop 28983 1726883036.16999: getting the next task for host managed_node2 28983 1726883036.17010: done getting next task for host managed_node2 28983 1726883036.17015: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883036.17022: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883036.17050: getting variables 28983 1726883036.17052: in VariableManager get_vars() 28983 1726883036.17105: Calling all_inventory to load vars for managed_node2 28983 1726883036.17109: Calling groups_inventory to load vars for managed_node2 28983 1726883036.17112: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883036.17123: Calling all_plugins_play to load vars for managed_node2 28983 1726883036.17127: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883036.17132: Calling groups_plugins_play to load vars for managed_node2 28983 1726883036.21604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883036.25940: done with get_vars() 28983 1726883036.25987: done getting variables 28983 1726883036.26060: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:43:56 -0400 (0:00:00.168) 0:01:06.258 ****** 28983 1726883036.26111: entering _queue_task() for managed_node2/fail 28983 1726883036.26819: worker is 1 (out of 1 available) 28983 1726883036.27061: exiting _queue_task() for managed_node2/fail 28983 1726883036.27078: done queuing things up, now waiting for results queue to drain 28983 1726883036.27081: waiting for pending results... 28983 1726883036.27929: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883036.28302: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010fc 28983 1726883036.28425: variable 'ansible_search_path' from source: unknown 28983 1726883036.28439: variable 'ansible_search_path' from source: unknown 28983 1726883036.28582: calling self._execute() 28983 1726883036.28789: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883036.28799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883036.28804: variable 'omit' from source: magic vars 28983 1726883036.29843: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.29864: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883036.30046: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883036.30340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883036.38045: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883036.38087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883036.38171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883036.38229: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883036.38271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883036.38383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.38483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.38487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.38537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.38560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.38641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 
1726883036.38681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.38726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.38787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.38814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.38917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.38921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.38956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.39013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.39041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 28983 1726883036.39296: variable 'network_connections' from source: include params 28983 1726883036.39316: variable 'interface' from source: play vars 28983 1726883036.39490: variable 'interface' from source: play vars 28983 1726883036.39529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883036.39762: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883036.39822: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883036.39922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883036.39925: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883036.39993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883036.40025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883036.40071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.40117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883036.40212: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883036.40755: variable 'network_connections' from source: include params 28983 1726883036.40758: variable 'interface' from source: play 
vars 28983 1726883036.40779: variable 'interface' from source: play vars 28983 1726883036.40821: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883036.40831: when evaluation is False, skipping this task 28983 1726883036.40841: _execute() done 28983 1726883036.40853: dumping result to json 28983 1726883036.40871: done dumping result, returning 28983 1726883036.40891: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000010fc] 28983 1726883036.40902: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fc skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883036.41296: no more pending results, returning what we have 28983 1726883036.41300: results queue empty 28983 1726883036.41301: checking for any_errors_fatal 28983 1726883036.41308: done checking for any_errors_fatal 28983 1726883036.41309: checking for max_fail_percentage 28983 1726883036.41311: done checking for max_fail_percentage 28983 1726883036.41312: checking to see if all hosts have failed and the running result is not ok 28983 1726883036.41313: done checking to see if all hosts have failed 28983 1726883036.41314: getting the remaining hosts for this loop 28983 1726883036.41316: done getting the remaining hosts for this loop 28983 1726883036.41321: getting the next task for host managed_node2 28983 1726883036.41331: done getting next task for host managed_node2 28983 1726883036.41339: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883036.41346: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883036.41374: getting variables 28983 1726883036.41377: in VariableManager get_vars() 28983 1726883036.41423: Calling all_inventory to load vars for managed_node2 28983 1726883036.41427: Calling groups_inventory to load vars for managed_node2 28983 1726883036.41429: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883036.41615: Calling all_plugins_play to load vars for managed_node2 28983 1726883036.41620: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883036.41668: Calling groups_plugins_play to load vars for managed_node2 28983 1726883036.42300: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fc 28983 1726883036.42304: WORKER PROCESS EXITING 28983 1726883036.44615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883036.47785: done with get_vars() 28983 1726883036.47821: done getting variables 28983 1726883036.47900: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:43:56 -0400 (0:00:00.218) 0:01:06.477 ****** 28983 1726883036.47948: entering _queue_task() for managed_node2/package 28983 1726883036.48571: worker is 1 (out of 1 available) 28983 1726883036.48585: exiting _queue_task() for managed_node2/package 28983 1726883036.48597: done queuing things up, now waiting for results queue to drain 28983 1726883036.48600: waiting for pending results... 28983 1726883036.48713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883036.48909: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010fd 28983 1726883036.48938: variable 'ansible_search_path' from source: unknown 28983 1726883036.49040: variable 'ansible_search_path' from source: unknown 28983 1726883036.49046: calling self._execute() 28983 1726883036.49130: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883036.49147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883036.49239: variable 'omit' from source: magic vars 28983 1726883036.49663: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.49686: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883036.49965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883036.50325: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 
1726883036.50396: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883036.50442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883036.50540: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883036.50704: variable 'network_packages' from source: role '' defaults 28983 1726883036.50860: variable '__network_provider_setup' from source: role '' defaults 28983 1726883036.50882: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883036.51017: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883036.51023: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883036.51084: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883036.51385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883036.53999: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883036.54094: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883036.54143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883036.54239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883036.54243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883036.54359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.54411: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.54452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.54523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.54616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.54620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.54661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.54701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.54767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.54793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883036.55131: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883036.55301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.55340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.55381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.55446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.55511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.55600: variable 'ansible_python' from source: facts 28983 1726883036.55633: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883036.55750: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883036.55868: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883036.56241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.56245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.56248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.56250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.56252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.56283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883036.56327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883036.56378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.56437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883036.56462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883036.56672: variable 'network_connections' from source: include params 
28983 1726883036.56689: variable 'interface' from source: play vars 28983 1726883036.56830: variable 'interface' from source: play vars 28983 1726883036.56931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883036.56976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883036.57024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883036.57076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883036.57244: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883036.57567: variable 'network_connections' from source: include params 28983 1726883036.57585: variable 'interface' from source: play vars 28983 1726883036.57718: variable 'interface' from source: play vars 28983 1726883036.57800: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883036.57916: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883036.58442: variable 'network_connections' from source: include params 28983 1726883036.58447: variable 'interface' from source: play vars 28983 1726883036.58479: variable 'interface' from source: play vars 28983 1726883036.58515: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883036.58632: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883036.59110: variable 'network_connections' 
from source: include params 28983 1726883036.59122: variable 'interface' from source: play vars 28983 1726883036.59219: variable 'interface' from source: play vars 28983 1726883036.59317: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883036.59426: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883036.59434: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883036.59844: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883036.60336: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883036.61856: variable 'network_connections' from source: include params 28983 1726883036.61868: variable 'interface' from source: play vars 28983 1726883036.61950: variable 'interface' from source: play vars 28983 1726883036.62090: variable 'ansible_distribution' from source: facts 28983 1726883036.62100: variable '__network_rh_distros' from source: role '' defaults 28983 1726883036.62113: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.62193: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883036.62622: variable 'ansible_distribution' from source: facts 28983 1726883036.62840: variable '__network_rh_distros' from source: role '' defaults 28983 1726883036.62847: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.62850: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883036.63200: variable 'ansible_distribution' from source: facts 28983 1726883036.63292: variable '__network_rh_distros' from source: role '' defaults 28983 1726883036.63304: variable 'ansible_distribution_major_version' from source: facts 28983 1726883036.63353: variable 'network_provider' from source: set_fact 28983 
1726883036.63415: variable 'ansible_facts' from source: unknown 28983 1726883036.66384: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883036.66541: when evaluation is False, skipping this task 28983 1726883036.66545: _execute() done 28983 1726883036.66547: dumping result to json 28983 1726883036.66549: done dumping result, returning 28983 1726883036.66551: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-0000000010fd] 28983 1726883036.66554: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fd skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883036.66867: no more pending results, returning what we have 28983 1726883036.66871: results queue empty 28983 1726883036.66874: checking for any_errors_fatal 28983 1726883036.66885: done checking for any_errors_fatal 28983 1726883036.66886: checking for max_fail_percentage 28983 1726883036.66888: done checking for max_fail_percentage 28983 1726883036.66889: checking to see if all hosts have failed and the running result is not ok 28983 1726883036.66890: done checking to see if all hosts have failed 28983 1726883036.66891: getting the remaining hosts for this loop 28983 1726883036.66893: done getting the remaining hosts for this loop 28983 1726883036.66899: getting the next task for host managed_node2 28983 1726883036.66909: done getting next task for host managed_node2 28983 1726883036.66913: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883036.66920: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883036.67147: getting variables 28983 1726883036.67150: in VariableManager get_vars() 28983 1726883036.67202: Calling all_inventory to load vars for managed_node2 28983 1726883036.67206: Calling groups_inventory to load vars for managed_node2 28983 1726883036.67214: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883036.67226: Calling all_plugins_play to load vars for managed_node2 28983 1726883036.67230: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883036.67350: Calling groups_plugins_play to load vars for managed_node2 28983 1726883036.67364: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fd 28983 1726883036.67368: WORKER PROCESS EXITING 28983 1726883036.72055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883036.78979: done with get_vars() 28983 1726883036.79017: done getting variables 28983 1726883036.79226: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:43:56 -0400 (0:00:00.313)       0:01:06.790 ******
28983 1726883036.79342: entering _queue_task() for managed_node2/package
28983 1726883036.80202: worker is 1 (out of 1 available)
28983 1726883036.80216: exiting _queue_task() for managed_node2/package
28983 1726883036.80241: done queuing things up, now waiting for results queue to drain
28983 1726883036.80243: waiting for pending results...
28983 1726883036.80814: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
28983 1726883036.81375: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010fe
28983 1726883036.81380: variable 'ansible_search_path' from source: unknown
28983 1726883036.81384: variable 'ansible_search_path' from source: unknown
28983 1726883036.81408: calling self._execute()
28983 1726883036.81646: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883036.81660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883036.81717: variable 'omit' from source: magic vars
28983 1726883036.82630: variable 'ansible_distribution_major_version' from source: facts
28983 1726883036.82699: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883036.83162: variable 'network_state' from source: role '' defaults
28983 1726883036.83166: Evaluated conditional (network_state != {}): False
28983 1726883036.83169: when evaluation
is False, skipping this task 28983 1726883036.83171: _execute() done 28983 1726883036.83176: dumping result to json 28983 1726883036.83179: done dumping result, returning 28983 1726883036.83182: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000010fe] 28983 1726883036.83188: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fe 28983 1726883036.83428: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010fe 28983 1726883036.83432: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883036.83490: no more pending results, returning what we have 28983 1726883036.83495: results queue empty 28983 1726883036.83496: checking for any_errors_fatal 28983 1726883036.83505: done checking for any_errors_fatal 28983 1726883036.83506: checking for max_fail_percentage 28983 1726883036.83509: done checking for max_fail_percentage 28983 1726883036.83510: checking to see if all hosts have failed and the running result is not ok 28983 1726883036.83511: done checking to see if all hosts have failed 28983 1726883036.83512: getting the remaining hosts for this loop 28983 1726883036.83514: done getting the remaining hosts for this loop 28983 1726883036.83519: getting the next task for host managed_node2 28983 1726883036.83530: done getting next task for host managed_node2 28983 1726883036.83538: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883036.83546: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883036.83583: getting variables 28983 1726883036.83585: in VariableManager get_vars() 28983 1726883036.83629: Calling all_inventory to load vars for managed_node2 28983 1726883036.83632: Calling groups_inventory to load vars for managed_node2 28983 1726883036.83938: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883036.83949: Calling all_plugins_play to load vars for managed_node2 28983 1726883036.83953: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883036.83957: Calling groups_plugins_play to load vars for managed_node2 28983 1726883036.88127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883036.94323: done with get_vars() 28983 1726883036.94430: done getting variables 28983 1726883036.94623: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:43:56 -0400 (0:00:00.153)       0:01:06.944 ******
28983 1726883036.94669: entering _queue_task() for managed_node2/package
28983 1726883036.95529: worker is 1 (out of 1 available)
28983 1726883036.95607: exiting _queue_task() for managed_node2/package
28983 1726883036.95622: done queuing things up, now waiting for results queue to drain
28983 1726883036.95624: waiting for pending results...
28983 1726883036.96055: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
28983 1726883036.96414: in run() - task 0affe814-3a2d-b16d-c0a7-0000000010ff
28983 1726883036.96431: variable 'ansible_search_path' from source: unknown
28983 1726883036.96436: variable 'ansible_search_path' from source: unknown
28983 1726883036.96479: calling self._execute()
28983 1726883036.96800: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883036.96806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883036.96816: variable 'omit' from source: magic vars
28983 1726883036.97671: variable 'ansible_distribution_major_version' from source: facts
28983 1726883036.97739: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883036.98033: variable 'network_state' from source: role '' defaults
28983 1726883036.98049: Evaluated conditional (network_state != {}): False
28983 1726883036.98052: when evaluation is False, skipping this task
28983 1726883036.98055: _execute() done
28983 1726883036.98058: dumping
result to json 28983 1726883036.98064: done dumping result, returning 28983 1726883036.98076: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000010ff] 28983 1726883036.98079: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010ff 28983 1726883036.98204: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000010ff 28983 1726883036.98208: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883036.98283: no more pending results, returning what we have 28983 1726883036.98288: results queue empty 28983 1726883036.98289: checking for any_errors_fatal 28983 1726883036.98297: done checking for any_errors_fatal 28983 1726883036.98298: checking for max_fail_percentage 28983 1726883036.98300: done checking for max_fail_percentage 28983 1726883036.98301: checking to see if all hosts have failed and the running result is not ok 28983 1726883036.98302: done checking to see if all hosts have failed 28983 1726883036.98303: getting the remaining hosts for this loop 28983 1726883036.98306: done getting the remaining hosts for this loop 28983 1726883036.98311: getting the next task for host managed_node2 28983 1726883036.98434: done getting next task for host managed_node2 28983 1726883036.98439: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883036.98446: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883036.98475: getting variables 28983 1726883036.98476: in VariableManager get_vars() 28983 1726883036.98524: Calling all_inventory to load vars for managed_node2 28983 1726883036.98528: Calling groups_inventory to load vars for managed_node2 28983 1726883036.98530: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883036.98659: Calling all_plugins_play to load vars for managed_node2 28983 1726883036.98663: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883036.98667: Calling groups_plugins_play to load vars for managed_node2 28983 1726883037.03763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883037.09856: done with get_vars() 28983 1726883037.09904: done getting variables 28983 1726883037.10183: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:43:57 -0400 (0:00:00.155)       0:01:07.100 ******
28983 1726883037.10229: entering _queue_task() for managed_node2/service
28983 1726883037.11042: worker is 1 (out of 1 available)
28983 1726883037.11057: exiting _queue_task() for managed_node2/service
28983 1726883037.11072: done queuing things up, now waiting for results queue to drain
28983 1726883037.11076: waiting for pending results...
28983 1726883037.11756: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28983 1726883037.11951: in run() - task 0affe814-3a2d-b16d-c0a7-000000001100
28983 1726883037.11967: variable 'ansible_search_path' from source: unknown
28983 1726883037.11970: variable 'ansible_search_path' from source: unknown
28983 1726883037.12128: calling self._execute()
28983 1726883037.12250: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883037.12258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883037.12270: variable 'omit' from source: magic vars
28983 1726883037.13442: variable 'ansible_distribution_major_version' from source: facts
28983 1726883037.13455: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883037.13727: variable '__network_wireless_connections_defined' from source: role '' defaults
28983 1726883037.14285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726883037.19343: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726883037.19348: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726883037.19350: Loading FilterModule 'mathstuff' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883037.19377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883037.19405: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883037.19504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.19539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.19579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.19627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.19646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.19740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.19743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.19768: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.19823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.19841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.19897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.19925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.20104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.20108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.20110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.20262: variable 'network_connections' from source: include params 28983 1726883037.20278: variable 'interface' from source: play vars 28983 1726883037.20367: variable 'interface' from source: play vars 28983 1726883037.20460: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883037.20666: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883037.20722: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883037.20763: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883037.20980: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883037.20983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883037.20986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883037.20988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.20991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883037.21019: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883037.21360: variable 'network_connections' from source: include params 28983 1726883037.21366: variable 'interface' from source: play vars 28983 1726883037.21444: variable 'interface' from source: play vars 28983 1726883037.21481: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883037.21485: when evaluation is False, skipping this task 28983 
1726883037.21488: _execute() done 28983 1726883037.21493: dumping result to json 28983 1726883037.21498: done dumping result, returning 28983 1726883037.21508: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001100] 28983 1726883037.21514: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001100 28983 1726883037.21912: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001100 28983 1726883037.21921: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883037.21992: no more pending results, returning what we have 28983 1726883037.21996: results queue empty 28983 1726883037.21997: checking for any_errors_fatal 28983 1726883037.22004: done checking for any_errors_fatal 28983 1726883037.22005: checking for max_fail_percentage 28983 1726883037.22007: done checking for max_fail_percentage 28983 1726883037.22008: checking to see if all hosts have failed and the running result is not ok 28983 1726883037.22009: done checking to see if all hosts have failed 28983 1726883037.22010: getting the remaining hosts for this loop 28983 1726883037.22012: done getting the remaining hosts for this loop 28983 1726883037.22016: getting the next task for host managed_node2 28983 1726883037.22024: done getting next task for host managed_node2 28983 1726883037.22029: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883037.22038: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883037.22059: getting variables 28983 1726883037.22061: in VariableManager get_vars() 28983 1726883037.22105: Calling all_inventory to load vars for managed_node2 28983 1726883037.22108: Calling groups_inventory to load vars for managed_node2 28983 1726883037.22111: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883037.22121: Calling all_plugins_play to load vars for managed_node2 28983 1726883037.22124: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883037.22127: Calling groups_plugins_play to load vars for managed_node2 28983 1726883037.26495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883037.30339: done with get_vars() 28983 1726883037.30378: done getting variables 28983 1726883037.30458: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:43:57 -0400 (0:00:00.202)       0:01:07.302 ******
28983 1726883037.30503: entering _queue_task() for managed_node2/service
28983 1726883037.31121: worker is 1 (out of 1 available)
28983 1726883037.31141: exiting _queue_task() for managed_node2/service
28983 1726883037.31159: done queuing things up, now waiting for results queue to drain
28983 1726883037.31161: waiting for pending results...
28983 1726883037.31689: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28983 1726883037.31887: in run() - task 0affe814-3a2d-b16d-c0a7-000000001101
28983 1726883037.31916: variable 'ansible_search_path' from source: unknown
28983 1726883037.31925: variable 'ansible_search_path' from source: unknown
28983 1726883037.31978: calling self._execute()
28983 1726883037.32126: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883037.32151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883037.32238: variable 'omit' from source: magic vars
28983 1726883037.32670: variable 'ansible_distribution_major_version' from source: facts
28983 1726883037.32697: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883037.32940: variable 'network_provider' from source: set_fact
28983 1726883037.32953: variable 'network_state' from source: role '' defaults
28983 1726883037.32969: Evaluated conditional (network_provider == "nm" or network_state != {}): True
28983 1726883037.32984: variable 'omit' from source: magic vars
28983 1726883037.33081: variable
'omit' from source: magic vars 28983 1726883037.33129: variable 'network_service_name' from source: role '' defaults 28983 1726883037.33221: variable 'network_service_name' from source: role '' defaults 28983 1726883037.33381: variable '__network_provider_setup' from source: role '' defaults 28983 1726883037.33434: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883037.33487: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883037.33500: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883037.33600: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883037.34055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883037.37519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883037.37626: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883037.37741: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883037.37838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883037.37984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883037.38341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.38346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.38440: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.38448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.38484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.38608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.38713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.38753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.38906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.38933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.39665: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883037.40017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.40055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.40096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.40443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.40447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.40536: variable 'ansible_python' from source: facts 28983 1726883037.40568: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883037.40783: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883037.40984: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883037.41426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.41429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.41641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.41645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.41647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.41803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883037.42041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883037.42044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.42082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883037.42107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883037.42485: variable 'network_connections' from source: include params 28983 1726883037.42526: variable 'interface' from source: play vars 28983 1726883037.42716: variable 'interface' from source: play vars 28983 1726883037.43141: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883037.43841: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883037.43845: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883037.43900: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883037.43961: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883037.44039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883037.44185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883037.44229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883037.44322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883037.44506: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883037.45293: variable 'network_connections' from source: include params 28983 1726883037.45390: variable 'interface' from source: play vars 28983 1726883037.45601: variable 'interface' from source: play vars 28983 1726883037.45669: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883037.45901: variable '__network_wireless_connections_defined' from source: role '' defaults 
28983 1726883037.46909: variable 'network_connections' from source: include params 28983 1726883037.46913: variable 'interface' from source: play vars 28983 1726883037.46965: variable 'interface' from source: play vars 28983 1726883037.47052: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883037.47230: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883037.48167: variable 'network_connections' from source: include params 28983 1726883037.48440: variable 'interface' from source: play vars 28983 1726883037.48444: variable 'interface' from source: play vars 28983 1726883037.48540: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883037.48753: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883037.48792: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883037.49107: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883037.49636: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883037.51087: variable 'network_connections' from source: include params 28983 1726883037.51098: variable 'interface' from source: play vars 28983 1726883037.51204: variable 'interface' from source: play vars 28983 1726883037.51252: variable 'ansible_distribution' from source: facts 28983 1726883037.51488: variable '__network_rh_distros' from source: role '' defaults 28983 1726883037.51492: variable 'ansible_distribution_major_version' from source: facts 28983 1726883037.51494: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883037.51867: variable 'ansible_distribution' from source: facts 28983 1726883037.51881: variable '__network_rh_distros' from source: role '' defaults 28983 1726883037.51894: variable 'ansible_distribution_major_version' from 
source: facts 28983 1726883037.51937: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883037.52501: variable 'ansible_distribution' from source: facts 28983 1726883037.52513: variable '__network_rh_distros' from source: role '' defaults 28983 1726883037.52524: variable 'ansible_distribution_major_version' from source: facts 28983 1726883037.52578: variable 'network_provider' from source: set_fact 28983 1726883037.52616: variable 'omit' from source: magic vars 28983 1726883037.52732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883037.52776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883037.52827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883037.53132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883037.53137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883037.53140: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883037.53143: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883037.53146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883037.53366: Set connection var ansible_connection to ssh 28983 1726883037.53390: Set connection var ansible_shell_executable to /bin/sh 28983 1726883037.53406: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883037.53422: Set connection var ansible_timeout to 10 28983 1726883037.53436: Set connection var ansible_pipelining to False 28983 1726883037.53444: Set connection var ansible_shell_type to sh 28983 1726883037.53570: variable 'ansible_shell_executable' from 
source: unknown 28983 1726883037.53581: variable 'ansible_connection' from source: unknown 28983 1726883037.53591: variable 'ansible_module_compression' from source: unknown 28983 1726883037.53601: variable 'ansible_shell_type' from source: unknown 28983 1726883037.53609: variable 'ansible_shell_executable' from source: unknown 28983 1726883037.53617: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883037.53627: variable 'ansible_pipelining' from source: unknown 28983 1726883037.53636: variable 'ansible_timeout' from source: unknown 28983 1726883037.53646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883037.54121: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883037.54132: variable 'omit' from source: magic vars 28983 1726883037.54137: starting attempt loop 28983 1726883037.54140: running the handler 28983 1726883037.54286: variable 'ansible_facts' from source: unknown 28983 1726883037.57194: _low_level_execute_command(): starting 28983 1726883037.57362: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883037.59193: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883037.59264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883037.59428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883037.59458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883037.59604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883037.59723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883037.61498: stdout chunk (state=3): >>>/root <<< 28983 1726883037.61675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883037.61703: stderr chunk (state=3): >>><<< 28983 1726883037.61783: stdout chunk (state=3): >>><<< 28983 1726883037.61811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883037.62039: _low_level_execute_command(): starting 28983 1726883037.62044: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013 `" && echo ansible-tmp-1726883037.6188612-31336-214818745192013="` echo /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013 `" ) && sleep 0' 28983 1726883037.63311: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883037.63448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883037.63526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883037.63543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883037.63640: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883037.63724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883037.65793: stdout chunk (state=3): >>>ansible-tmp-1726883037.6188612-31336-214818745192013=/root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013 <<< 28983 1726883037.66036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883037.66099: stderr chunk (state=3): >>><<< 28983 1726883037.66131: stdout chunk (state=3): >>><<< 28983 1726883037.66159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883037.6188612-31336-214818745192013=/root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883037.66265: variable 'ansible_module_compression' from source: unknown 
28983 1726883037.66542: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883037.66546: variable 'ansible_facts' from source: unknown 28983 1726883037.67004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py 28983 1726883037.67552: Sending initial data 28983 1726883037.67558: Sent initial data (156 bytes) 28983 1726883037.69525: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883037.69630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883037.70019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883037.70039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883037.70639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883037.70745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883037.72508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883037.72569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883037.72645: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpel467srn /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py <<< 28983 1726883037.72649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py" <<< 28983 1726883037.72700: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpel467srn" to remote "/root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py" <<< 28983 1726883037.77443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883037.77447: stdout chunk (state=3): >>><<< 28983 1726883037.77452: stderr chunk (state=3): >>><<< 28983 1726883037.77455: done transferring module to remote 28983 1726883037.77458: _low_level_execute_command(): starting 28983 1726883037.77680: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/ /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py && sleep 0' 28983 1726883037.78999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883037.79128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883037.79231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883037.79300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883037.81654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883037.81658: stdout chunk (state=3): >>><<< 28983 1726883037.81660: stderr chunk (state=3): >>><<< 28983 1726883037.81663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883037.81665: _low_level_execute_command(): starting 28983 1726883037.81668: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/AnsiballZ_systemd.py && sleep 0' 28983 1726883037.82863: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883037.82921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883037.83152: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883037.83422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883037.83746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883038.16279: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4452352", "MemoryAvailable": "infinity", "CPUUsageNSec": "1562606000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": 
"18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726883038.16321: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", 
"LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 28983 1726883038.16333: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", 
"CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883038.18448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883038.18456: stdout chunk (state=3): >>><<< 28983 1726883038.18459: stderr chunk (state=3): >>><<< 28983 1726883038.18644: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4452352", "MemoryAvailable": "infinity", "CPUUsageNSec": "1562606000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883038.19262: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883038.19410: _low_level_execute_command(): starting 28983 1726883038.19432: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883037.6188612-31336-214818745192013/ > /dev/null 2>&1 && sleep 0' 28983 1726883038.20399: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883038.20458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883038.20480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883038.20508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883038.20567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883038.20585: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883038.20655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883038.20721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883038.20762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883038.20797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883038.20996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883038.23006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883038.23280: stderr chunk (state=3): >>><<< 28983 1726883038.23285: stdout chunk (state=3): >>><<< 28983 1726883038.23288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883038.23291: handler run complete 28983 1726883038.23293: attempt loop complete, returning result 28983 1726883038.23295: _execute() done 28983 1726883038.23297: dumping result to json 28983 1726883038.23299: done dumping result, returning 28983 1726883038.23301: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000001101] 28983 1726883038.23304: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001101 28983 1726883038.24237: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001101 28983 1726883038.24242: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883038.24309: no more pending results, returning what we have 28983 1726883038.24313: results queue empty 28983 1726883038.24314: checking for any_errors_fatal 28983 1726883038.24540: done checking for any_errors_fatal 28983 1726883038.24542: checking for max_fail_percentage 28983 1726883038.24544: done checking for max_fail_percentage 28983 1726883038.24545: checking to see if all hosts have failed and the running result is not ok 28983 1726883038.24546: done checking to see if all hosts have failed 28983 1726883038.24547: getting the remaining hosts for this loop 28983 1726883038.24550: done getting the remaining hosts for this loop 28983 1726883038.24555: getting the next task for host managed_node2 28983 1726883038.24564: done getting next task for host managed_node2 28983 1726883038.24569: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883038.24575: ^ state is: HOST STATE: block=6, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883038.24591: getting variables 28983 1726883038.24593: in VariableManager get_vars() 28983 1726883038.24632: Calling all_inventory to load vars for managed_node2 28983 1726883038.24894: Calling groups_inventory to load vars for managed_node2 28983 1726883038.24899: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883038.24909: Calling all_plugins_play to load vars for managed_node2 28983 1726883038.24923: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883038.24927: Calling groups_plugins_play to load vars for managed_node2 28983 1726883038.30976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883038.35518: done with get_vars() 28983 1726883038.35563: done getting variables 28983 1726883038.35798: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:43:58 -0400 (0:00:01.054) 0:01:08.357 ****** 28983 1726883038.35966: entering _queue_task() for managed_node2/service 28983 1726883038.37044: worker is 1 (out of 1 available) 28983 1726883038.37058: exiting _queue_task() for managed_node2/service 28983 1726883038.37081: done queuing things up, now waiting for results queue to drain 28983 1726883038.37083: waiting for pending results... 
28983 1726883038.37471: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883038.37507: in run() - task 0affe814-3a2d-b16d-c0a7-000000001102 28983 1726883038.37522: variable 'ansible_search_path' from source: unknown 28983 1726883038.37526: variable 'ansible_search_path' from source: unknown 28983 1726883038.37580: calling self._execute() 28983 1726883038.37704: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883038.37708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883038.37722: variable 'omit' from source: magic vars 28983 1726883038.38294: variable 'ansible_distribution_major_version' from source: facts 28983 1726883038.38345: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883038.38493: variable 'network_provider' from source: set_fact 28983 1726883038.38520: Evaluated conditional (network_provider == "nm"): True 28983 1726883038.38636: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883038.38788: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883038.39070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883038.43564: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883038.43643: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883038.43678: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883038.43716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883038.43848: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883038.43900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883038.43930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883038.43961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883038.43996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883038.44010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883038.44065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883038.44080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883038.44102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883038.44175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883038.44186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883038.44222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883038.44243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883038.44266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883038.44304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883038.44316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883038.44441: variable 'network_connections' from source: include params 28983 1726883038.44454: variable 'interface' from source: play vars 28983 1726883038.44520: variable 'interface' from source: play vars 28983 1726883038.44615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883038.44784: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883038.44818: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883038.44854: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883038.44889: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883038.44933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883038.44955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883038.44975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883038.45010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883038.45075: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883038.45303: variable 'network_connections' from source: include params 28983 1726883038.45307: variable 'interface' from source: play vars 28983 1726883038.45378: variable 'interface' from source: play vars 28983 1726883038.45446: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883038.45450: when evaluation is False, skipping this task 28983 1726883038.45453: _execute() done 28983 1726883038.45455: dumping result to json 28983 1726883038.45457: done dumping result, returning 28983 1726883038.45482: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000001102] 28983 
1726883038.45493: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001102
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
28983 1726883038.45662: no more pending results, returning what we have
28983 1726883038.45665: results queue empty
28983 1726883038.45666: checking for any_errors_fatal
28983 1726883038.45692: done checking for any_errors_fatal
28983 1726883038.45693: checking for max_fail_percentage
28983 1726883038.45696: done checking for max_fail_percentage
28983 1726883038.45697: checking to see if all hosts have failed and the running result is not ok
28983 1726883038.45698: done checking to see if all hosts have failed
28983 1726883038.45699: getting the remaining hosts for this loop
28983 1726883038.45701: done getting the remaining hosts for this loop
28983 1726883038.45706: getting the next task for host managed_node2
28983 1726883038.45714: done getting next task for host managed_node2
28983 1726883038.45719: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
28983 1726883038.45725: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883038.45748: getting variables
28983 1726883038.45750: in VariableManager get_vars()
28983 1726883038.45793: Calling all_inventory to load vars for managed_node2
28983 1726883038.45797: Calling groups_inventory to load vars for managed_node2
28983 1726883038.45799: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883038.45809: Calling all_plugins_play to load vars for managed_node2
28983 1726883038.45812: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883038.45816: Calling groups_plugins_play to load vars for managed_node2
28983 1726883038.46476: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001102
28983 1726883038.46990: WORKER PROCESS EXITING
28983 1726883038.47785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883038.49400: done with get_vars()
28983 1726883038.49424: done getting variables
28983 1726883038.49480: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:43:58 -0400 (0:00:00.135) 0:01:08.492 ******
28983 1726883038.49506: entering _queue_task() for managed_node2/service
28983 1726883038.49798: worker is 1 (out of 1 available)
28983 1726883038.49813: exiting _queue_task() for managed_node2/service
28983 1726883038.49828: done queuing things up, now waiting for results queue to drain
28983 1726883038.49830: waiting for pending results...
28983 1726883038.50123: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service
28983 1726883038.50276: in run() - task 0affe814-3a2d-b16d-c0a7-000000001103
28983 1726883038.50294: variable 'ansible_search_path' from source: unknown
28983 1726883038.50298: variable 'ansible_search_path' from source: unknown
28983 1726883038.50348: calling self._execute()
28983 1726883038.50474: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883038.50495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883038.50503: variable 'omit' from source: magic vars
28983 1726883038.50951: variable 'ansible_distribution_major_version' from source: facts
28983 1726883038.50963: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883038.51110: variable 'network_provider' from source: set_fact
28983 1726883038.51114: Evaluated conditional (network_provider == "initscripts"): False
28983 1726883038.51138: when evaluation is False, skipping this task
28983 1726883038.51144: _execute() done
28983 1726883038.51147: dumping result to json
28983 1726883038.51150: done dumping result, returning
28983 1726883038.51186: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000001103]
28983 1726883038.51189: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001103
28983 1726883038.51286: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001103
28983 1726883038.51289: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
28983 1726883038.51360: no more pending results, returning what we have
28983 1726883038.51364: results queue empty
28983 1726883038.51365: checking for any_errors_fatal
28983 1726883038.51372: done checking for any_errors_fatal
28983 1726883038.51373: checking for max_fail_percentage
28983 1726883038.51376: done checking for max_fail_percentage
28983 1726883038.51377: checking to see if all hosts have failed and the running result is not ok
28983 1726883038.51378: done checking to see if all hosts have failed
28983 1726883038.51379: getting the remaining hosts for this loop
28983 1726883038.51381: done getting the remaining hosts for this loop
28983 1726883038.51385: getting the next task for host managed_node2
28983 1726883038.51392: done getting next task for host managed_node2
28983 1726883038.51396: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
28983 1726883038.51403: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883038.51427: getting variables
28983 1726883038.51429: in VariableManager get_vars()
28983 1726883038.51468: Calling all_inventory to load vars for managed_node2
28983 1726883038.51471: Calling groups_inventory to load vars for managed_node2
28983 1726883038.51474: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883038.51483: Calling all_plugins_play to load vars for managed_node2
28983 1726883038.51486: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883038.51490: Calling groups_plugins_play to load vars for managed_node2
28983 1726883038.53167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883038.55585: done with get_vars()
28983 1726883038.55626: done getting variables
28983 1726883038.55704: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:43:58 -0400 (0:00:00.062) 0:01:08.555 ******
28983 1726883038.55747: entering _queue_task() for managed_node2/copy
28983 1726883038.55990: worker is 1 (out of 1 available)
28983 1726883038.56006: exiting _queue_task() for managed_node2/copy
28983 1726883038.56019: done queuing things up, now waiting for results queue to drain
28983 1726883038.56021: waiting for pending results...
28983 1726883038.56252: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
28983 1726883038.56351: in run() - task 0affe814-3a2d-b16d-c0a7-000000001104
28983 1726883038.56365: variable 'ansible_search_path' from source: unknown
28983 1726883038.56369: variable 'ansible_search_path' from source: unknown
28983 1726883038.56404: calling self._execute()
28983 1726883038.56520: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883038.56524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883038.56541: variable 'omit' from source: magic vars
28983 1726883038.56866: variable 'ansible_distribution_major_version' from source: facts
28983 1726883038.56878: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883038.56979: variable 'network_provider' from source: set_fact
28983 1726883038.56983: Evaluated conditional (network_provider == "initscripts"): False
28983 1726883038.56988: when evaluation is False, skipping this task
28983 1726883038.56991: _execute() done
28983 1726883038.56995: dumping result to json
28983 1726883038.57000: done dumping result, returning
28983 1726883038.57009: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000001104]
28983 1726883038.57014: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001104
28983 1726883038.57119: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001104
28983 1726883038.57122: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
28983 1726883038.57179: no more pending results, returning what we have
28983 1726883038.57183: results queue empty
28983 1726883038.57184: checking for any_errors_fatal
28983 1726883038.57190: done checking for any_errors_fatal
28983 1726883038.57191: checking for max_fail_percentage
28983 1726883038.57193: done checking for max_fail_percentage
28983 1726883038.57194: checking to see if all hosts have failed and the running result is not ok
28983 1726883038.57195: done checking to see if all hosts have failed
28983 1726883038.57196: getting the remaining hosts for this loop
28983 1726883038.57198: done getting the remaining hosts for this loop
28983 1726883038.57202: getting the next task for host managed_node2
28983 1726883038.57209: done getting next task for host managed_node2
28983 1726883038.57214: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
28983 1726883038.57219: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883038.57240: getting variables
28983 1726883038.57251: in VariableManager get_vars()
28983 1726883038.57288: Calling all_inventory to load vars for managed_node2
28983 1726883038.57290: Calling groups_inventory to load vars for managed_node2
28983 1726883038.57292: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883038.57299: Calling all_plugins_play to load vars for managed_node2
28983 1726883038.57301: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883038.57303: Calling groups_plugins_play to load vars for managed_node2
28983 1726883038.58811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883038.61721: done with get_vars()
28983 1726883038.61762: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:43:58 -0400 (0:00:00.061) 0:01:08.616 ******
28983 1726883038.61877: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
28983 1726883038.62122: worker is 1 (out of 1 available)
28983 1726883038.62138: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
28983 1726883038.62150: done queuing things up, now waiting for results queue to drain
28983 1726883038.62152: waiting for pending results...
28983 1726883038.62384: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
28983 1726883038.62490: in run() - task 0affe814-3a2d-b16d-c0a7-000000001105
28983 1726883038.62505: variable 'ansible_search_path' from source: unknown
28983 1726883038.62510: variable 'ansible_search_path' from source: unknown
28983 1726883038.62543: calling self._execute()
28983 1726883038.62629: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883038.62636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883038.62645: variable 'omit' from source: magic vars
28983 1726883038.62974: variable 'ansible_distribution_major_version' from source: facts
28983 1726883038.62984: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883038.62990: variable 'omit' from source: magic vars
28983 1726883038.63060: variable 'omit' from source: magic vars
28983 1726883038.63226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726883038.65761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726883038.65840: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726883038.65894: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28983 1726883038.65969: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28983 1726883038.65976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28983 1726883038.66042: variable 'network_provider' from source: set_fact
28983 1726883038.66203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726883038.66232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726883038.66266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726883038.66317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726883038.66338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726883038.66415: variable 'omit' from source: magic vars
28983 1726883038.66513: variable 'omit' from source: magic vars
28983 1726883038.66603: variable 'network_connections' from source: include params
28983 1726883038.66614: variable 'interface' from source: play vars
28983 1726883038.66671: variable 'interface' from source: play vars
28983 1726883038.66824: variable 'omit' from source: magic vars
28983 1726883038.66844: variable '__lsr_ansible_managed' from source: task vars
28983 1726883038.66910: variable '__lsr_ansible_managed' from source: task vars
28983 1726883038.67120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
28983 1726883038.67377: Loaded config def from plugin (lookup/template)
28983 1726883038.67381: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
28983 1726883038.67424: File lookup term: get_ansible_managed.j2
28983 1726883038.67428: variable 'ansible_search_path' from source: unknown
28983 1726883038.67432: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
28983 1726883038.67452: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
28983 1726883038.67459: variable 'ansible_search_path' from source: unknown
28983 1726883038.78723: variable 'ansible_managed' from source: unknown
28983 1726883038.78924: variable 'omit' from source: magic vars
28983 1726883038.78945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28983 1726883038.78994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28983 1726883038.78997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28983 1726883038.79032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883038.79038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883038.79103: variable 'inventory_hostname' from source: host vars for 'managed_node2'
28983 1726883038.79108: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883038.79111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883038.79171: Set connection var ansible_connection to ssh
28983 1726883038.79183: Set connection var ansible_shell_executable to /bin/sh
28983 1726883038.79192: Set connection var ansible_module_compression to ZIP_DEFLATED
28983 1726883038.79202: Set connection var ansible_timeout to 10
28983 1726883038.79208: Set connection var ansible_pipelining to False
28983 1726883038.79211: Set connection var ansible_shell_type to sh
28983 1726883038.79252: variable 'ansible_shell_executable' from source: unknown
28983 1726883038.79255: variable 'ansible_connection' from source: unknown
28983 1726883038.79258: variable 'ansible_module_compression' from source: unknown
28983 1726883038.79261: variable 'ansible_shell_type' from source: unknown
28983 1726883038.79264: variable 'ansible_shell_executable' from source: unknown
28983 1726883038.79266: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883038.79283: variable 'ansible_pipelining' from source: unknown
28983 1726883038.79286: variable 'ansible_timeout' from source: unknown
28983 1726883038.79288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883038.79416: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
28983 1726883038.79437: variable 'omit' from source: magic vars
28983 1726883038.79440: starting attempt loop
28983 1726883038.79443: running the handler
28983 1726883038.79480: _low_level_execute_command(): starting
28983 1726883038.79483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28983 1726883038.80089: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883038.80094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.80098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883038.80100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.80158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
28983 1726883038.80161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883038.80252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883038.82026: stdout chunk (state=3): >>>/root <<<
28983 1726883038.82141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883038.82241: stderr chunk (state=3): >>><<<
28983 1726883038.82244: stdout chunk (state=3): >>><<<
28983 1726883038.82281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883038.82288: _low_level_execute_command(): starting
28983 1726883038.82299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238 `" && echo ansible-tmp-1726883038.8227577-31380-100130068680238="` echo /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238 `" ) && sleep 0'
28983 1726883038.82826: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883038.82829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.82832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883038.82836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.82911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883038.83007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883038.85031: stdout chunk (state=3): >>>ansible-tmp-1726883038.8227577-31380-100130068680238=/root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238 <<<
28983 1726883038.85156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883038.85203: stderr chunk (state=3): >>><<<
28983 1726883038.85207: stdout chunk (state=3): >>><<<
28983 1726883038.85220: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883038.8227577-31380-100130068680238=/root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883038.85257: variable 'ansible_module_compression' from source: unknown
28983 1726883038.85299: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
28983 1726883038.85337: variable 'ansible_facts' from source: unknown
28983 1726883038.85431: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py
28983 1726883038.85544: Sending initial data
28983 1726883038.85548: Sent initial data (168 bytes)
28983 1726883038.86146: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883038.86155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883038.86166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.86191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883038.86205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883038.86300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883038.88001: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
28983 1726883038.88080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
28983 1726883038.88159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpys1083au /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py <<<
28983 1726883038.88190: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py" <<<
28983 1726883038.88327: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpys1083au" to remote "/root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py" <<<
28983 1726883038.89894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883038.89965: stderr chunk (state=3): >>><<<
28983 1726883038.89978: stdout chunk (state=3): >>><<<
28983 1726883038.90098: done transferring module to remote
28983 1726883038.90101: _low_level_execute_command(): starting
28983 1726883038.90104: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/ /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py && sleep 0'
28983 1726883038.90648: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
28983 1726883038.90664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883038.90783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883038.90803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883038.91059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883038.91137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883038.93051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883038.93093: stderr chunk (state=3): >>><<<
28983 1726883038.93101: stdout chunk (state=3): >>><<<
28983 1726883038.93122: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883038.93128: _low_level_execute_command(): starting
28983 1726883038.93136: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/AnsiballZ_network_connections.py && sleep 0'
28983 1726883038.93584: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883038.93587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.93590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883038.93593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883038.93640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883038.93657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883038.93730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883039.26201: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
28983 1726883039.28157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed.
<<< 28983 1726883039.28221: stderr chunk (state=3): >>><<< 28983 1726883039.28241: stdout chunk (state=3): >>><<< 28983 1726883039.28341: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883039.28344: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883039.28347: _low_level_execute_command(): starting 28983 1726883039.28349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883038.8227577-31380-100130068680238/ > /dev/null 2>&1 && sleep 0' 28983 1726883039.28877: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address <<< 28983 1726883039.28882: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883039.29060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883039.31193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883039.31197: stdout chunk (state=3): >>><<< 28983 1726883039.31199: stderr chunk (state=3): >>><<< 28983 1726883039.31267: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883039.31336: handler run complete 28983 1726883039.31443: attempt loop complete, returning result 28983 1726883039.31448: _execute() done 28983 1726883039.31451: dumping result to json 28983 1726883039.31608: done dumping result, returning 28983 1726883039.31612: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-000000001105] 28983 1726883039.31615: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001105 28983 1726883039.31888: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001105 28983 1726883039.31892: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 28983 1726883039.32195: no more pending results, returning what we have 28983 1726883039.32199: results queue empty 28983 1726883039.32200: checking for any_errors_fatal 28983 1726883039.32210: done checking for any_errors_fatal 28983 1726883039.32211: checking for max_fail_percentage 28983 1726883039.32213: done checking for max_fail_percentage 28983 1726883039.32218: checking to see if all hosts have failed and the running result is not ok 28983 1726883039.32219: done checking to see if all hosts have failed 28983 1726883039.32220: getting the remaining hosts for this loop 28983 1726883039.32222: done getting the remaining hosts for this loop 28983 1726883039.32226: 
getting the next task for host managed_node2 28983 1726883039.32315: done getting next task for host managed_node2 28983 1726883039.32321: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883039.32326: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883039.32343: getting variables 28983 1726883039.32345: in VariableManager get_vars() 28983 1726883039.32386: Calling all_inventory to load vars for managed_node2 28983 1726883039.32390: Calling groups_inventory to load vars for managed_node2 28983 1726883039.32393: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883039.32403: Calling all_plugins_play to load vars for managed_node2 28983 1726883039.32407: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883039.32411: Calling groups_plugins_play to load vars for managed_node2 28983 1726883039.35191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883039.38240: done with get_vars() 28983 1726883039.38277: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:43:59 -0400 (0:00:00.765) 0:01:09.381 ****** 28983 1726883039.38422: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883039.39164: worker is 1 (out of 1 available) 28983 1726883039.39175: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883039.39188: done queuing things up, now waiting for results queue to drain 28983 1726883039.39190: waiting for pending results... 
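The trace above walks through Ansible's standard remote-execution lifecycle for the `network_connections` module: the AnsiballZ payload is uploaded over SFTP, the tmp directory and module are made executable with `chmod u+x`, the module is run with the remote `python3.12` (emitting its JSON result on stdout), and the tmp directory is removed with `rm -f -r`. A rough local re-enactment of that four-step sequence, with illustrative stand-in paths and a trivial stand-in module body (not Ansible's own code):

```python
# Re-enactment of the AnsiballZ lifecycle traced in the log:
# transfer module -> chmod u+x -> execute with python -> rm tmpdir.
import json, os, stat, subprocess, sys, tempfile

tmpdir = tempfile.mkdtemp()                    # stands in for ansible-tmp-*/
module = os.path.join(tmpdir, "AnsiballZ_module.py")

# 1. "done transferring module to remote": write the module payload
with open(module, "w") as f:
    f.write("import json\nprint(json.dumps({'changed': True}))\n")

# 2. 'chmod u+x <tmpdir> <module> && sleep 0', as in _low_level_execute_command
for path in (tmpdir, module):
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

# 3. execute with the interpreter; the JSON result arrives on stdout
proc = subprocess.run([sys.executable, module], capture_output=True, text=True)
result = json.loads(proc.stdout)

# 4. 'rm -f -r <tmpdir> > /dev/null 2>&1 && sleep 0'
subprocess.run(["rm", "-f", "-r", tmpdir])
```

In the real run the "module payload" is a zip of the module plus its Ansible module_utils dependencies wrapped in a bootstrap script, and all four steps go over the multiplexed SSH connection shown in the stderr chunks.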
28983 1726883039.39524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883039.39577: in run() - task 0affe814-3a2d-b16d-c0a7-000000001106 28983 1726883039.39620: variable 'ansible_search_path' from source: unknown 28983 1726883039.39637: variable 'ansible_search_path' from source: unknown 28983 1726883039.39690: calling self._execute() 28983 1726883039.39836: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.39853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.39947: variable 'omit' from source: magic vars 28983 1726883039.40420: variable 'ansible_distribution_major_version' from source: facts 28983 1726883039.40448: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883039.40632: variable 'network_state' from source: role '' defaults 28983 1726883039.40652: Evaluated conditional (network_state != {}): False 28983 1726883039.40659: when evaluation is False, skipping this task 28983 1726883039.40666: _execute() done 28983 1726883039.40672: dumping result to json 28983 1726883039.40681: done dumping result, returning 28983 1726883039.40692: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000001106] 28983 1726883039.40708: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001106 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883039.40918: no more pending results, returning what we have 28983 1726883039.40924: results queue empty 28983 1726883039.40925: checking for any_errors_fatal 28983 1726883039.40943: done checking for any_errors_fatal 28983 1726883039.40944: checking for max_fail_percentage 28983 1726883039.40947: done checking for max_fail_percentage 28983 1726883039.40948: 
checking to see if all hosts have failed and the running result is not ok 28983 1726883039.40949: done checking to see if all hosts have failed 28983 1726883039.40950: getting the remaining hosts for this loop 28983 1726883039.40952: done getting the remaining hosts for this loop 28983 1726883039.40960: getting the next task for host managed_node2 28983 1726883039.40969: done getting next task for host managed_node2 28983 1726883039.40973: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883039.40980: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883039.41006: getting variables 28983 1726883039.41008: in VariableManager get_vars() 28983 1726883039.41271: Calling all_inventory to load vars for managed_node2 28983 1726883039.41274: Calling groups_inventory to load vars for managed_node2 28983 1726883039.41277: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883039.41285: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001106 28983 1726883039.41288: WORKER PROCESS EXITING 28983 1726883039.41298: Calling all_plugins_play to load vars for managed_node2 28983 1726883039.41302: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883039.41306: Calling groups_plugins_play to load vars for managed_node2 28983 1726883039.44214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883039.47275: done with get_vars() 28983 1726883039.47311: done getting variables 28983 1726883039.47382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:43:59 -0400 (0:00:00.090) 0:01:09.471 ****** 28983 1726883039.47422: entering _queue_task() for managed_node2/debug 28983 1726883039.47769: worker is 1 (out of 1 available) 28983 1726883039.47782: exiting _queue_task() for managed_node2/debug 28983 1726883039.47794: done queuing things up, now waiting for results queue to drain 28983 1726883039.47796: waiting for pending results... 
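The "Configure networking state" task above is skipped because its `when:` conditional, `network_state != {}`, evaluates to False against the role default of an empty dict. A minimal sketch of that check (a hypothetical stand-in for TaskExecutor's conditional evaluation, not Ansible's actual code path):

```python
# Sketch of the skip seen above: with the role default network_state = {},
# the conditional `network_state != {}` is False and the task is skipped.
def evaluate_task(network_state: dict) -> dict:
    """Illustrative stand-in for the task's conditional check."""
    if network_state != {}:
        return {"changed": True}  # would dispatch to the network_state module
    return {
        "changed": False,
        "false_condition": "network_state != {}",
        "skip_reason": "Conditional result was False",
    }

result = evaluate_task({})  # role default: no network_state requested
```

The dict returned on the skip branch mirrors the `skipping: [managed_node2]` result printed in the log.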
28983 1726883039.48116: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883039.48344: in run() - task 0affe814-3a2d-b16d-c0a7-000000001107 28983 1726883039.48360: variable 'ansible_search_path' from source: unknown 28983 1726883039.48441: variable 'ansible_search_path' from source: unknown 28983 1726883039.48449: calling self._execute() 28983 1726883039.48561: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.48579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.48608: variable 'omit' from source: magic vars 28983 1726883039.49100: variable 'ansible_distribution_major_version' from source: facts 28983 1726883039.49340: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883039.49344: variable 'omit' from source: magic vars 28983 1726883039.49347: variable 'omit' from source: magic vars 28983 1726883039.49350: variable 'omit' from source: magic vars 28983 1726883039.49353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883039.49385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883039.49413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883039.49444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883039.49464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883039.49505: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883039.49513: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.49522: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883039.49644: Set connection var ansible_connection to ssh 28983 1726883039.49662: Set connection var ansible_shell_executable to /bin/sh 28983 1726883039.49676: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883039.49699: Set connection var ansible_timeout to 10 28983 1726883039.49799: Set connection var ansible_pipelining to False 28983 1726883039.49802: Set connection var ansible_shell_type to sh 28983 1726883039.49805: variable 'ansible_shell_executable' from source: unknown 28983 1726883039.49807: variable 'ansible_connection' from source: unknown 28983 1726883039.49809: variable 'ansible_module_compression' from source: unknown 28983 1726883039.49811: variable 'ansible_shell_type' from source: unknown 28983 1726883039.49813: variable 'ansible_shell_executable' from source: unknown 28983 1726883039.49815: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.49817: variable 'ansible_pipelining' from source: unknown 28983 1726883039.49819: variable 'ansible_timeout' from source: unknown 28983 1726883039.49821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.49970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883039.49990: variable 'omit' from source: magic vars 28983 1726883039.50001: starting attempt loop 28983 1726883039.50012: running the handler 28983 1726883039.50179: variable '__network_connections_result' from source: set_fact 28983 1726883039.50256: handler run complete 28983 1726883039.50285: attempt loop complete, returning result 28983 1726883039.50293: _execute() done 28983 1726883039.50301: dumping result to json 28983 1726883039.50309: 
done dumping result, returning 28983 1726883039.50327: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001107] 28983 1726883039.50344: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001107 28983 1726883039.50533: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001107 28983 1726883039.50538: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423" ] } 28983 1726883039.50639: no more pending results, returning what we have 28983 1726883039.50644: results queue empty 28983 1726883039.50645: checking for any_errors_fatal 28983 1726883039.50654: done checking for any_errors_fatal 28983 1726883039.50655: checking for max_fail_percentage 28983 1726883039.50658: done checking for max_fail_percentage 28983 1726883039.50659: checking to see if all hosts have failed and the running result is not ok 28983 1726883039.50660: done checking to see if all hosts have failed 28983 1726883039.50661: getting the remaining hosts for this loop 28983 1726883039.50663: done getting the remaining hosts for this loop 28983 1726883039.50669: getting the next task for host managed_node2 28983 1726883039.50678: done getting next task for host managed_node2 28983 1726883039.50683: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883039.50691: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883039.50706: getting variables 28983 1726883039.50708: in VariableManager get_vars() 28983 1726883039.50872: Calling all_inventory to load vars for managed_node2 28983 1726883039.50876: Calling groups_inventory to load vars for managed_node2 28983 1726883039.50880: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883039.50891: Calling all_plugins_play to load vars for managed_node2 28983 1726883039.50895: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883039.50899: Calling groups_plugins_play to load vars for managed_node2 28983 1726883039.53757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883039.58778: done with get_vars() 28983 1726883039.58833: done getting variables 28983 1726883039.59030: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:43:59 -0400 (0:00:00.116) 0:01:09.588 ****** 28983 1726883039.59084: entering _queue_task() for managed_node2/debug 28983 1726883039.59508: worker is 1 (out of 1 available) 28983 1726883039.59524: exiting _queue_task() for managed_node2/debug 28983 1726883039.59942: done queuing things up, now waiting for results queue to drain 28983 1726883039.59944: waiting for pending results... 28983 1726883039.60354: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883039.60572: in run() - task 0affe814-3a2d-b16d-c0a7-000000001108 28983 1726883039.60597: variable 'ansible_search_path' from source: unknown 28983 1726883039.60739: variable 'ansible_search_path' from source: unknown 28983 1726883039.60743: calling self._execute() 28983 1726883039.60995: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.60998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.61001: variable 'omit' from source: magic vars 28983 1726883039.62330: variable 'ansible_distribution_major_version' from source: facts 28983 1726883039.62412: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883039.62424: variable 'omit' from source: magic vars 28983 1726883039.62638: variable 'omit' from source: magic vars 28983 1726883039.62772: variable 'omit' from source: magic vars 28983 1726883039.62824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883039.62933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883039.62965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883039.63039: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883039.63043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883039.63051: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883039.63061: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.63070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.63196: Set connection var ansible_connection to ssh 28983 1726883039.63214: Set connection var ansible_shell_executable to /bin/sh 28983 1726883039.63228: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883039.63247: Set connection var ansible_timeout to 10 28983 1726883039.63264: Set connection var ansible_pipelining to False 28983 1726883039.63440: Set connection var ansible_shell_type to sh 28983 1726883039.63443: variable 'ansible_shell_executable' from source: unknown 28983 1726883039.63446: variable 'ansible_connection' from source: unknown 28983 1726883039.63448: variable 'ansible_module_compression' from source: unknown 28983 1726883039.63451: variable 'ansible_shell_type' from source: unknown 28983 1726883039.63453: variable 'ansible_shell_executable' from source: unknown 28983 1726883039.63455: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.63458: variable 'ansible_pipelining' from source: unknown 28983 1726883039.63460: variable 'ansible_timeout' from source: unknown 28983 1726883039.63462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.63527: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883039.63549: variable 'omit' from source: magic vars 28983 1726883039.63559: starting attempt loop 28983 1726883039.63567: running the handler 28983 1726883039.63628: variable '__network_connections_result' from source: set_fact 28983 1726883039.63737: variable '__network_connections_result' from source: set_fact 28983 1726883039.63899: handler run complete 28983 1726883039.63948: attempt loop complete, returning result 28983 1726883039.64016: _execute() done 28983 1726883039.64020: dumping result to json 28983 1726883039.64023: done dumping result, returning 28983 1726883039.64026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001108] 28983 1726883039.64029: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001108 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423" ] } } 28983 1726883039.64354: no more pending results, returning what we have 28983 1726883039.64359: results queue empty 28983 1726883039.64360: checking for any_errors_fatal 28983 1726883039.64368: done checking for any_errors_fatal 28983 1726883039.64369: 
checking for max_fail_percentage 28983 1726883039.64372: done checking for max_fail_percentage 28983 1726883039.64373: checking to see if all hosts have failed and the running result is not ok 28983 1726883039.64374: done checking to see if all hosts have failed 28983 1726883039.64375: getting the remaining hosts for this loop 28983 1726883039.64377: done getting the remaining hosts for this loop 28983 1726883039.64383: getting the next task for host managed_node2 28983 1726883039.64393: done getting next task for host managed_node2 28983 1726883039.64399: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883039.64405: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883039.64421: getting variables 28983 1726883039.64423: in VariableManager get_vars() 28983 1726883039.64676: Calling all_inventory to load vars for managed_node2 28983 1726883039.64680: Calling groups_inventory to load vars for managed_node2 28983 1726883039.64690: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883039.64699: Calling all_plugins_play to load vars for managed_node2 28983 1726883039.64705: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883039.64709: Calling groups_plugins_play to load vars for managed_node2 28983 1726883039.65596: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001108 28983 1726883039.65600: WORKER PROCESS EXITING 28983 1726883039.77249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883039.84038: done with get_vars() 28983 1726883039.84218: done getting variables 28983 1726883039.84353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:43:59 -0400 (0:00:00.253) 0:01:09.841 ****** 28983 1726883039.84401: entering _queue_task() for managed_node2/debug 28983 1726883039.84894: worker is 1 (out of 1 available) 28983 1726883039.84908: exiting _queue_task() for managed_node2/debug 28983 1726883039.84981: done queuing things up, now waiting for results queue to drain 28983 1726883039.84984: waiting for pending results... 
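Annotation: the repeated `Set connection var ... / variable '...' from source: unknown` pairs earlier in this log show per-task connection options falling back to built-in defaults when nothing in the inventory or play overrides them, while `ansible_host` and `ansible_ssh_extra_args` come from host vars. A minimal sketch of that precedence, assuming a hypothetical helper name (`resolve_connection_var` is not Ansible's real API; the `ansible_ssh_extra_args` value below is invented, the log never prints it):

```python
# Defaults observed in this log when a variable's source is "unknown",
# i.e. nothing in inventory/play overrides the built-in value.
DEFAULTS = {
    "ansible_connection": "ssh",
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_module_compression": "ZIP_DEFLATED",
    "ansible_timeout": 10,
    "ansible_pipelining": False,
}

def resolve_connection_var(name, host_vars):
    """Host vars win; otherwise fall back to the built-in default."""
    if name in host_vars:
        return host_vars[name], "host vars"
    return DEFAULTS[name], "unknown"

host_vars = {
    "ansible_host": "10.31.46.139",
    "ansible_ssh_extra_args": "-o StrictHostKeyChecking=no",  # value hypothetical
}
```

With these inputs, `resolve_connection_var("ansible_host", host_vars)` reports the host-vars source and `resolve_connection_var("ansible_timeout", host_vars)` falls back to the default of 10 seconds, matching the source attributions printed above.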
28983 1726883039.85242: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883039.85446: in run() - task 0affe814-3a2d-b16d-c0a7-000000001109 28983 1726883039.85462: variable 'ansible_search_path' from source: unknown 28983 1726883039.85467: variable 'ansible_search_path' from source: unknown 28983 1726883039.85516: calling self._execute() 28983 1726883039.85647: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.85656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.85667: variable 'omit' from source: magic vars 28983 1726883039.86348: variable 'ansible_distribution_major_version' from source: facts 28983 1726883039.86436: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883039.86617: variable 'network_state' from source: role '' defaults 28983 1726883039.86631: Evaluated conditional (network_state != {}): False 28983 1726883039.86635: when evaluation is False, skipping this task 28983 1726883039.87048: _execute() done 28983 1726883039.87053: dumping result to json 28983 1726883039.87056: done dumping result, returning 28983 1726883039.87059: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000001109] 28983 1726883039.87062: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001109 28983 1726883039.87233: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001109 28983 1726883039.87240: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883039.87305: no more pending results, returning what we have 28983 1726883039.87310: results queue empty 28983 1726883039.87311: checking for any_errors_fatal 28983 1726883039.87328: done checking for any_errors_fatal 28983 1726883039.87329: checking for 
max_fail_percentage 28983 1726883039.87331: done checking for max_fail_percentage 28983 1726883039.87333: checking to see if all hosts have failed and the running result is not ok 28983 1726883039.87341: done checking to see if all hosts have failed 28983 1726883039.87343: getting the remaining hosts for this loop 28983 1726883039.87349: done getting the remaining hosts for this loop 28983 1726883039.87356: getting the next task for host managed_node2 28983 1726883039.87366: done getting next task for host managed_node2 28983 1726883039.87371: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883039.87382: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883039.87410: getting variables 28983 1726883039.87412: in VariableManager get_vars() 28983 1726883039.87762: Calling all_inventory to load vars for managed_node2 28983 1726883039.87766: Calling groups_inventory to load vars for managed_node2 28983 1726883039.87769: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883039.87781: Calling all_plugins_play to load vars for managed_node2 28983 1726883039.87785: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883039.87790: Calling groups_plugins_play to load vars for managed_node2 28983 1726883039.90998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883039.97336: done with get_vars() 28983 1726883039.97385: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:43:59 -0400 (0:00:00.131) 0:01:09.972 ****** 28983 1726883039.97518: entering _queue_task() for managed_node2/ping 28983 1726883039.97954: worker is 1 (out of 1 available) 28983 1726883039.97968: exiting _queue_task() for managed_node2/ping 28983 1726883039.97986: done queuing things up, now waiting for results queue to drain 28983 1726883039.97988: waiting for pending results... 
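Annotation: the "Show debug messages for the network_state" task above was skipped because its `when:` conditional `network_state != {}` evaluated to False against the empty role default. A stand-in sketch of that control flow (real Ansible templates the expression through Jinja2; the restricted `eval` here only illustrates the gating):

```python
def evaluate_when(expr, variables):
    # Stand-in only: Ansible renders `when:` expressions with Jinja2,
    # not Python eval. Builtins are stripped to keep the sketch tame.
    return bool(eval(expr, {"__builtins__": {}}, dict(variables)))

# network_state comes from the role defaults and is empty here, so the
# conditional is False and the task is skipped, as the log records.
skipped = not evaluate_when("network_state != {}", {"network_state": {}})
```

A non-empty `network_state` would flip the conditional to True and the debug task would run instead of being skipped.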
28983 1726883039.98439: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883039.98517: in run() - task 0affe814-3a2d-b16d-c0a7-00000000110a 28983 1726883039.98533: variable 'ansible_search_path' from source: unknown 28983 1726883039.98539: variable 'ansible_search_path' from source: unknown 28983 1726883039.98591: calling self._execute() 28983 1726883039.98841: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883039.98845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883039.98847: variable 'omit' from source: magic vars 28983 1726883039.99897: variable 'ansible_distribution_major_version' from source: facts 28983 1726883039.99910: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883039.99918: variable 'omit' from source: magic vars 28983 1726883040.00252: variable 'omit' from source: magic vars 28983 1726883040.00439: variable 'omit' from source: magic vars 28983 1726883040.00505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883040.00570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883040.00605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883040.00618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883040.00632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883040.00670: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883040.00674: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.00683: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883040.01355: Set connection var ansible_connection to ssh 28983 1726883040.01359: Set connection var ansible_shell_executable to /bin/sh 28983 1726883040.01363: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883040.01366: Set connection var ansible_timeout to 10 28983 1726883040.01369: Set connection var ansible_pipelining to False 28983 1726883040.01374: Set connection var ansible_shell_type to sh 28983 1726883040.01377: variable 'ansible_shell_executable' from source: unknown 28983 1726883040.01382: variable 'ansible_connection' from source: unknown 28983 1726883040.01386: variable 'ansible_module_compression' from source: unknown 28983 1726883040.01389: variable 'ansible_shell_type' from source: unknown 28983 1726883040.01391: variable 'ansible_shell_executable' from source: unknown 28983 1726883040.01394: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.01398: variable 'ansible_pipelining' from source: unknown 28983 1726883040.01401: variable 'ansible_timeout' from source: unknown 28983 1726883040.01468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883040.01993: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883040.02000: variable 'omit' from source: magic vars 28983 1726883040.02003: starting attempt loop 28983 1726883040.02006: running the handler 28983 1726883040.02344: _low_level_execute_command(): starting 28983 1726883040.02348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883040.03053: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883040.03137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883040.03256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883040.03461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883040.05290: stdout chunk (state=3): >>>/root <<< 28983 1726883040.05476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883040.05488: stdout chunk (state=3): >>><<< 28983 1726883040.05494: stderr chunk (state=3): >>><<< 28983 1726883040.05641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883040.05654: _low_level_execute_command(): starting 28983 1726883040.05663: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065 `" && echo ansible-tmp-1726883040.0563977-31442-174872307369065="` echo /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065 `" ) && sleep 0' 28983 1726883040.07430: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883040.07435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883040.07554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883040.07674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883040.09787: stdout chunk (state=3): >>>ansible-tmp-1726883040.0563977-31442-174872307369065=/root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065 <<< 28983 1726883040.09958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883040.10158: stderr chunk (state=3): >>><<< 28983 1726883040.10162: stdout chunk (state=3): >>><<< 28983 1726883040.10165: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883040.0563977-31442-174872307369065=/root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883040.10194: variable 'ansible_module_compression' from source: unknown 28983 1726883040.10313: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883040.10542: variable 'ansible_facts' from source: unknown 28983 1726883040.10586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py 28983 1726883040.11056: Sending initial data 28983 1726883040.11060: Sent initial data (153 bytes) 28983 1726883040.12343: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883040.12353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883040.12356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883040.12358: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883040.12361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883040.12494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883040.12540: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883040.12731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883040.14459: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883040.14609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883040.14686: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpiavxj2g9 /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py <<< 28983 1726883040.14690: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py" <<< 28983 1726883040.14769: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpiavxj2g9" to remote "/root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py" <<< 28983 1726883040.16275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883040.16509: stderr chunk (state=3): >>><<< 28983 1726883040.16513: stdout chunk (state=3): >>><<< 28983 1726883040.16516: done transferring module to remote 28983 1726883040.16520: _low_level_execute_command(): starting 28983 1726883040.16525: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/ /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py && sleep 0' 28983 1726883040.17151: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883040.17180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883040.17192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883040.17218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883040.17365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883040.19470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883040.19477: stderr chunk (state=3): >>><<< 28983 1726883040.19582: stdout chunk (state=3): >>><<< 28983 1726883040.19585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883040.19593: _low_level_execute_command(): starting 28983 1726883040.19596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/AnsiballZ_ping.py && sleep 0' 28983 1726883040.20379: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883040.20383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883040.20403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883040.20424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883040.20464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883040.20522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883040.20567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883040.20572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883040.20591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 28983 1726883040.20710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883040.38059: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883040.39870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883040.39877: stdout chunk (state=3): >>><<< 28983 1726883040.39880: stderr chunk (state=3): >>><<< 28983 1726883040.39884: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
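Annotation: the stdout chunk above is the entire payload of `ansible.builtin.ping` — it simply echoes its `data` argument (default `"pong"`) back inside a JSON result. A condensed sketch of that behavior (the function name is illustrative; the real module ships as an AnsiballZ-wrapped script like the `AnsiballZ_ping.py` transferred above):

```python
import json

def ping_module(module_args):
    # `data` defaults to "pong"; passing data="crash" is the module's
    # documented way to force a failure for testing error paths.
    data = module_args.get("data", "pong")
    if data == "crash":
        raise Exception(data)
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({})))
```

Run with empty arguments, this reproduces the result captured in the log: `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`.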
28983 1726883040.39888: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883040.39891: _low_level_execute_command(): starting 28983 1726883040.39894: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883040.0563977-31442-174872307369065/ > /dev/null 2>&1 && sleep 0' 28983 1726883040.40977: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883040.41251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master <<< 28983 1726883040.41454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883040.41561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883040.43589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883040.43656: stderr chunk (state=3): >>><<< 28983 1726883040.43670: stdout chunk (state=3): >>><<< 28983 1726883040.43697: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883040.43724: handler run complete 28983 1726883040.43781: attempt loop complete, returning result 28983 1726883040.43813: _execute() done 28983 1726883040.44042: dumping result to json 28983 1726883040.44045: done dumping result, returning 28983 1726883040.44047: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-00000000110a] 28983 1726883040.44050: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000110a 28983 1726883040.44131: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000110a ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883040.44215: no more pending results, returning what we have 28983 1726883040.44356: results queue empty 28983 1726883040.44358: checking for any_errors_fatal 28983 1726883040.44367: done checking for any_errors_fatal 28983 1726883040.44368: checking for max_fail_percentage 28983 1726883040.44370: done checking for max_fail_percentage 28983 1726883040.44372: checking to see if all hosts have failed and the running result is not ok 28983 1726883040.44373: done checking to see if all hosts have failed 28983 1726883040.44374: getting the remaining hosts for this loop 28983 1726883040.44376: done getting the remaining hosts for this loop 28983 1726883040.44382: getting the next task for host managed_node2 28983 1726883040.44394: done getting next task for host managed_node2 28983 1726883040.44398: ^ task is: TASK: meta (role_complete) 28983 1726883040.44404: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883040.44421: getting variables 28983 1726883040.44423: in VariableManager get_vars() 28983 1726883040.44224: WORKER PROCESS EXITING 28983 1726883040.44674: Calling all_inventory to load vars for managed_node2 28983 1726883040.44709: Calling groups_inventory to load vars for managed_node2 28983 1726883040.44713: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883040.44724: Calling all_plugins_play to load vars for managed_node2 28983 1726883040.44728: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883040.44732: Calling groups_plugins_play to load vars for managed_node2 28983 1726883040.47821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883040.51207: done with get_vars() 28983 1726883040.51248: done getting variables 28983 1726883040.51359: done queuing things up, now waiting for results queue to drain 28983 1726883040.51361: results queue empty 28983 1726883040.51362: checking for any_errors_fatal 28983 1726883040.51366: done checking for any_errors_fatal 28983 1726883040.51367: checking for max_fail_percentage 28983 1726883040.51368: done checking for max_fail_percentage 28983 1726883040.51369: checking to see if all hosts have failed and the running result is not ok 28983 1726883040.51370: done checking to see if all hosts have failed 28983 1726883040.51371: getting the remaining hosts for this loop 28983 1726883040.51379: done getting the remaining hosts for this loop 28983 1726883040.51383: getting the next task for host 
managed_node2 28983 1726883040.51389: done getting next task for host managed_node2 28983 1726883040.51392: ^ task is: TASK: Show result 28983 1726883040.51395: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883040.51398: getting variables 28983 1726883040.51400: in VariableManager get_vars() 28983 1726883040.51463: Calling all_inventory to load vars for managed_node2 28983 1726883040.51466: Calling groups_inventory to load vars for managed_node2 28983 1726883040.51470: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883040.51476: Calling all_plugins_play to load vars for managed_node2 28983 1726883040.51479: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883040.51491: Calling groups_plugins_play to load vars for managed_node2 28983 1726883040.53828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883040.58447: done with get_vars() 28983 1726883040.58487: done getting variables 28983 1726883040.58552: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:44:00 -0400 (0:00:00.610) 0:01:10.583 ****** 28983 1726883040.58595: entering _queue_task() for managed_node2/debug 28983 1726883040.59158: worker is 1 (out of 1 available) 28983 1726883040.59173: exiting _queue_task() for managed_node2/debug 28983 1726883040.59186: done queuing things up, now waiting for results queue to drain 28983 1726883040.59188: waiting for pending results... 
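The `TASK [Show result]` header immediately above points at `create_bridge_profile.yml:14` and is handled by the `debug` action plugin. As a hedged sketch only (the task file itself is not reproduced in this log; the variable name is inferred from the `__network_connections_result` fact printed in this task's `ok:` result), the task likely resembles:

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/create_bridge_profile.yml:14 -- the real
# file may differ. The variable name matches the fact that the log shows
# being resolved ("variable '__network_connections_result' from source:
# set_fact") and then dumped in the task's ok: output.
- name: Show result
  ansible.builtin.debug:
    var: __network_connections_result
```

A `debug` task with `var:` produces exactly the kind of `ok: [managed_node2] => { "__network_connections_result": ... }` result block that follows in this log.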
28983 1726883040.59487: running TaskExecutor() for managed_node2/TASK: Show result 28983 1726883040.59655: in run() - task 0affe814-3a2d-b16d-c0a7-000000001090 28983 1726883040.59680: variable 'ansible_search_path' from source: unknown 28983 1726883040.59685: variable 'ansible_search_path' from source: unknown 28983 1726883040.59816: calling self._execute() 28983 1726883040.59855: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.59865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883040.59882: variable 'omit' from source: magic vars 28983 1726883040.60392: variable 'ansible_distribution_major_version' from source: facts 28983 1726883040.60406: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883040.60413: variable 'omit' from source: magic vars 28983 1726883040.60490: variable 'omit' from source: magic vars 28983 1726883040.60580: variable 'omit' from source: magic vars 28983 1726883040.60594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883040.60799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883040.60803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883040.60807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883040.60810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883040.60814: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883040.60817: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.60820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883040.60893: Set 
connection var ansible_connection to ssh 28983 1726883040.60906: Set connection var ansible_shell_executable to /bin/sh 28983 1726883040.60923: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883040.60936: Set connection var ansible_timeout to 10 28983 1726883040.60945: Set connection var ansible_pipelining to False 28983 1726883040.60948: Set connection var ansible_shell_type to sh 28983 1726883040.60987: variable 'ansible_shell_executable' from source: unknown 28983 1726883040.60990: variable 'ansible_connection' from source: unknown 28983 1726883040.60994: variable 'ansible_module_compression' from source: unknown 28983 1726883040.60999: variable 'ansible_shell_type' from source: unknown 28983 1726883040.61001: variable 'ansible_shell_executable' from source: unknown 28983 1726883040.61006: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.61014: variable 'ansible_pipelining' from source: unknown 28983 1726883040.61016: variable 'ansible_timeout' from source: unknown 28983 1726883040.61025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883040.61211: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883040.61225: variable 'omit' from source: magic vars 28983 1726883040.61230: starting attempt loop 28983 1726883040.61241: running the handler 28983 1726883040.61453: variable '__network_connections_result' from source: set_fact 28983 1726883040.61457: variable '__network_connections_result' from source: set_fact 28983 1726883040.61639: handler run complete 28983 1726883040.61643: attempt loop complete, returning result 28983 1726883040.61645: _execute() done 28983 1726883040.61648: dumping result to json 28983 
1726883040.61650: done dumping result, returning 28983 1726883040.61653: done running TaskExecutor() for managed_node2/TASK: Show result [0affe814-3a2d-b16d-c0a7-000000001090] 28983 1726883040.61655: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001090 28983 1726883040.61744: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001090 28983 1726883040.61748: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423" ] } } 28983 1726883040.61843: no more pending results, returning what we have 28983 1726883040.61847: results queue empty 28983 1726883040.61848: checking for any_errors_fatal 28983 1726883040.61851: done checking for any_errors_fatal 28983 1726883040.61852: checking for max_fail_percentage 28983 1726883040.61854: done checking for max_fail_percentage 28983 1726883040.61856: checking to see if all hosts have failed and the running result is not ok 28983 1726883040.61857: done checking to see if all hosts have failed 28983 1726883040.61858: getting the remaining hosts for this loop 28983 1726883040.61860: done getting the remaining hosts for this loop 28983 1726883040.61865: getting the next task for host managed_node2 28983 1726883040.61879: done getting next task for host managed_node2 28983 1726883040.61884: ^ task is: TASK: Include network role 28983 1726883040.61889: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883040.61896: getting variables 28983 1726883040.61898: in VariableManager get_vars() 28983 1726883040.62154: Calling all_inventory to load vars for managed_node2 28983 1726883040.62158: Calling groups_inventory to load vars for managed_node2 28983 1726883040.62162: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883040.62172: Calling all_plugins_play to load vars for managed_node2 28983 1726883040.62176: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883040.62180: Calling groups_plugins_play to load vars for managed_node2 28983 1726883040.64737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883040.68002: done with get_vars() 28983 1726883040.68049: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:44:00 -0400 (0:00:00.095) 0:01:10.679 ****** 28983 1726883040.68166: entering _queue_task() for 
managed_node2/include_role 28983 1726883040.68630: worker is 1 (out of 1 available) 28983 1726883040.68646: exiting _queue_task() for managed_node2/include_role 28983 1726883040.68659: done queuing things up, now waiting for results queue to drain 28983 1726883040.68661: waiting for pending results... 28983 1726883040.68892: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883040.69240: in run() - task 0affe814-3a2d-b16d-c0a7-000000001094 28983 1726883040.69246: variable 'ansible_search_path' from source: unknown 28983 1726883040.69250: variable 'ansible_search_path' from source: unknown 28983 1726883040.69253: calling self._execute() 28983 1726883040.69255: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.69259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883040.69440: variable 'omit' from source: magic vars 28983 1726883040.69776: variable 'ansible_distribution_major_version' from source: facts 28983 1726883040.69787: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883040.69795: _execute() done 28983 1726883040.69799: dumping result to json 28983 1726883040.69802: done dumping result, returning 28983 1726883040.69810: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000001094] 28983 1726883040.69830: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001094 28983 1726883040.70140: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001094 28983 1726883040.70144: WORKER PROCESS EXITING 28983 1726883040.70178: no more pending results, returning what we have 28983 1726883040.70183: in VariableManager get_vars() 28983 1726883040.70221: Calling all_inventory to load vars for managed_node2 28983 1726883040.70225: Calling groups_inventory to load vars for managed_node2 28983 1726883040.70228: Calling all_plugins_inventory to load vars for managed_node2 28983 
1726883040.70240: Calling all_plugins_play to load vars for managed_node2 28983 1726883040.70244: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883040.70248: Calling groups_plugins_play to load vars for managed_node2 28983 1726883040.72631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883040.75855: done with get_vars() 28983 1726883040.75892: variable 'ansible_search_path' from source: unknown 28983 1726883040.75894: variable 'ansible_search_path' from source: unknown 28983 1726883040.76118: variable 'omit' from source: magic vars 28983 1726883040.76187: variable 'omit' from source: magic vars 28983 1726883040.76208: variable 'omit' from source: magic vars 28983 1726883040.76212: we have included files to process 28983 1726883040.76213: generating all_blocks data 28983 1726883040.76215: done generating all_blocks data 28983 1726883040.76222: processing included file: fedora.linux_system_roles.network 28983 1726883040.76251: in VariableManager get_vars() 28983 1726883040.76277: done with get_vars() 28983 1726883040.76312: in VariableManager get_vars() 28983 1726883040.76336: done with get_vars() 28983 1726883040.76393: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883040.76578: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883040.76712: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883040.77529: in VariableManager get_vars() 28983 1726883040.77558: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883040.80402: iterating over new_blocks loaded from include file 28983 1726883040.80404: in VariableManager get_vars() 28983 1726883040.80440: done with get_vars() 28983 1726883040.80442: 
filtering new block on tags 28983 1726883040.80905: done filtering new block on tags 28983 1726883040.80909: in VariableManager get_vars() 28983 1726883040.80929: done with get_vars() 28983 1726883040.80931: filtering new block on tags 28983 1726883040.80955: done filtering new block on tags 28983 1726883040.80957: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883040.80965: extending task lists for all hosts with included blocks 28983 1726883040.81137: done extending task lists 28983 1726883040.81139: done processing included files 28983 1726883040.81140: results queue empty 28983 1726883040.81141: checking for any_errors_fatal 28983 1726883040.81146: done checking for any_errors_fatal 28983 1726883040.81148: checking for max_fail_percentage 28983 1726883040.81149: done checking for max_fail_percentage 28983 1726883040.81150: checking to see if all hosts have failed and the running result is not ok 28983 1726883040.81151: done checking to see if all hosts have failed 28983 1726883040.81152: getting the remaining hosts for this loop 28983 1726883040.81154: done getting the remaining hosts for this loop 28983 1726883040.81157: getting the next task for host managed_node2 28983 1726883040.81163: done getting next task for host managed_node2 28983 1726883040.81166: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883040.81170: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883040.81193: getting variables 28983 1726883040.81194: in VariableManager get_vars() 28983 1726883040.81210: Calling all_inventory to load vars for managed_node2 28983 1726883040.81213: Calling groups_inventory to load vars for managed_node2 28983 1726883040.81216: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883040.81222: Calling all_plugins_play to load vars for managed_node2 28983 1726883040.81226: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883040.81230: Calling groups_plugins_play to load vars for managed_node2 28983 1726883040.83374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883040.87002: done with get_vars() 28983 1726883040.87037: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:00 -0400 (0:00:00.191) 0:01:10.870 ****** 28983 1726883040.87323: entering _queue_task() for managed_node2/include_tasks 28983 1726883040.88164: worker is 1 (out of 1 available) 28983 
1726883040.88188: exiting _queue_task() for managed_node2/include_tasks 28983 1726883040.88202: done queuing things up, now waiting for results queue to drain 28983 1726883040.88204: waiting for pending results... 28983 1726883040.88547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883040.88827: in run() - task 0affe814-3a2d-b16d-c0a7-00000000127a 28983 1726883040.88969: variable 'ansible_search_path' from source: unknown 28983 1726883040.88976: variable 'ansible_search_path' from source: unknown 28983 1726883040.89009: calling self._execute() 28983 1726883040.89128: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883040.89138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883040.89256: variable 'omit' from source: magic vars 28983 1726883040.90425: variable 'ansible_distribution_major_version' from source: facts 28983 1726883040.90450: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883040.90456: _execute() done 28983 1726883040.90460: dumping result to json 28983 1726883040.90465: done dumping result, returning 28983 1726883040.90478: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-00000000127a] 28983 1726883040.90493: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127a 28983 1726883040.90669: no more pending results, returning what we have 28983 1726883040.90676: in VariableManager get_vars() 28983 1726883040.90845: Calling all_inventory to load vars for managed_node2 28983 1726883040.90849: Calling groups_inventory to load vars for managed_node2 28983 1726883040.90852: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883040.90867: Calling all_plugins_play to load vars for managed_node2 28983 1726883040.90871: Calling groups_plugins_inventory to 
load vars for managed_node2 28983 1726883040.90875: Calling groups_plugins_play to load vars for managed_node2 28983 1726883040.91559: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127a 28983 1726883040.91563: WORKER PROCESS EXITING 28983 1726883040.93594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883040.97680: done with get_vars() 28983 1726883040.97712: variable 'ansible_search_path' from source: unknown 28983 1726883040.97713: variable 'ansible_search_path' from source: unknown 28983 1726883040.97762: we have included files to process 28983 1726883040.97764: generating all_blocks data 28983 1726883040.97766: done generating all_blocks data 28983 1726883040.97770: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883040.97772: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883040.97778: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883040.98547: done processing included file 28983 1726883040.98549: iterating over new_blocks loaded from include file 28983 1726883040.98551: in VariableManager get_vars() 28983 1726883040.98594: done with get_vars() 28983 1726883040.98596: filtering new block on tags 28983 1726883040.98640: done filtering new block on tags 28983 1726883040.98643: in VariableManager get_vars() 28983 1726883040.98680: done with get_vars() 28983 1726883040.98683: filtering new block on tags 28983 1726883040.98748: done filtering new block on tags 28983 1726883040.98751: in VariableManager get_vars() 28983 1726883040.98787: done with get_vars() 28983 1726883040.98789: filtering new block on tags 28983 1726883040.98853: done filtering new block on tags 28983 1726883040.98856: done iterating over new_blocks 
loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883040.98862: extending task lists for all hosts with included blocks 28983 1726883041.01918: done extending task lists 28983 1726883041.01920: done processing included files 28983 1726883041.01921: results queue empty 28983 1726883041.01922: checking for any_errors_fatal 28983 1726883041.01926: done checking for any_errors_fatal 28983 1726883041.01927: checking for max_fail_percentage 28983 1726883041.01928: done checking for max_fail_percentage 28983 1726883041.01929: checking to see if all hosts have failed and the running result is not ok 28983 1726883041.01930: done checking to see if all hosts have failed 28983 1726883041.01931: getting the remaining hosts for this loop 28983 1726883041.01933: done getting the remaining hosts for this loop 28983 1726883041.01938: getting the next task for host managed_node2 28983 1726883041.01945: done getting next task for host managed_node2 28983 1726883041.01948: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883041.01953: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883041.01966: getting variables 28983 1726883041.01968: in VariableManager get_vars() 28983 1726883041.02000: Calling all_inventory to load vars for managed_node2 28983 1726883041.02004: Calling groups_inventory to load vars for managed_node2 28983 1726883041.02007: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883041.02014: Calling all_plugins_play to load vars for managed_node2 28983 1726883041.02017: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883041.02021: Calling groups_plugins_play to load vars for managed_node2 28983 1726883041.04324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883041.10002: done with get_vars() 28983 1726883041.10053: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:44:01 -0400 (0:00:00.228) 0:01:11.099 ****** 28983 1726883041.10165: entering _queue_task() for managed_node2/setup 28983 1726883041.10866: worker is 1 (out of 1 available) 28983 1726883041.10881: exiting _queue_task() for managed_node2/setup 28983 
1726883041.10896: done queuing things up, now waiting for results queue to drain 28983 1726883041.10898: waiting for pending results... 28983 1726883041.11556: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883041.11938: in run() - task 0affe814-3a2d-b16d-c0a7-0000000012d1 28983 1726883041.11971: variable 'ansible_search_path' from source: unknown 28983 1726883041.11984: variable 'ansible_search_path' from source: unknown 28983 1726883041.12090: calling self._execute() 28983 1726883041.12348: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883041.12367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883041.12391: variable 'omit' from source: magic vars 28983 1726883041.13122: variable 'ansible_distribution_major_version' from source: facts 28983 1726883041.13156: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883041.13640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883041.18812: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883041.19338: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883041.19389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883041.19758: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883041.19762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883041.20215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883041.20223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883041.20359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883041.20567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883041.20571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883041.20799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883041.21077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883041.21081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883041.21215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883041.21262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883041.21642: variable '__network_required_facts' from source: role '' defaults 28983 1726883041.21906: variable 'ansible_facts' from source: unknown 28983 1726883041.24572: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883041.24576: when evaluation is False, skipping this task 28983 1726883041.24579: _execute() done 28983 1726883041.24586: dumping result to json 28983 1726883041.24589: done dumping result, returning 28983 1726883041.24600: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-0000000012d1] 28983 1726883041.24605: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d1 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883041.24889: no more pending results, returning what we have 28983 1726883041.24894: results queue empty 28983 1726883041.24895: checking for any_errors_fatal 28983 1726883041.24897: done checking for any_errors_fatal 28983 1726883041.24898: checking for max_fail_percentage 28983 1726883041.24900: done checking for max_fail_percentage 28983 1726883041.24901: checking to see if all hosts have failed and the running result is not ok 28983 1726883041.24902: done checking to see if all hosts have failed 28983 1726883041.24903: getting the remaining hosts for this loop 28983 1726883041.24905: done getting the remaining hosts for this loop 28983 1726883041.24910: getting the next task for host managed_node2 28983 1726883041.24921: done getting next task for host managed_node2 28983 1726883041.24925: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883041.24932: ^ state is: 
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883041.24958: getting variables 28983 1726883041.24959: in VariableManager get_vars() 28983 1726883041.25000: Calling all_inventory to load vars for managed_node2 28983 1726883041.25157: Calling groups_inventory to load vars for managed_node2 28983 1726883041.25162: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883041.25249: Calling all_plugins_play to load vars for managed_node2 28983 1726883041.25254: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883041.25291: Calling groups_plugins_play to load vars for managed_node2 28983 1726883041.25866: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d1 28983 1726883041.25876: WORKER PROCESS EXITING 28983 1726883041.28220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883041.32448: done with get_vars() 28983 1726883041.32485: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:44:01 -0400 (0:00:00.225) 0:01:11.324 ****** 28983 1726883041.32728: entering _queue_task() for managed_node2/stat 28983 1726883041.33475: worker is 1 (out of 1 available) 28983 1726883041.33488: exiting _queue_task() for managed_node2/stat 28983 1726883041.33501: done queuing things up, now waiting for results queue to drain 28983 1726883041.33503: waiting for pending results... 
28983 1726883041.33886: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883041.34132: in run() - task 0affe814-3a2d-b16d-c0a7-0000000012d3 28983 1726883041.34189: variable 'ansible_search_path' from source: unknown 28983 1726883041.34193: variable 'ansible_search_path' from source: unknown 28983 1726883041.34220: calling self._execute() 28983 1726883041.34379: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883041.34407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883041.34436: variable 'omit' from source: magic vars 28983 1726883041.34998: variable 'ansible_distribution_major_version' from source: facts 28983 1726883041.35006: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883041.35355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883041.35737: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883041.35986: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883041.36145: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883041.36229: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883041.36427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883041.36514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883041.36559: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883041.36745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883041.36815: variable '__network_is_ostree' from source: set_fact 28983 1726883041.36838: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883041.36848: when evaluation is False, skipping this task 28983 1726883041.36864: _execute() done 28983 1726883041.36899: dumping result to json 28983 1726883041.36910: done dumping result, returning 28983 1726883041.36984: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-0000000012d3] 28983 1726883041.36992: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d3 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883041.37259: no more pending results, returning what we have 28983 1726883041.37265: results queue empty 28983 1726883041.37267: checking for any_errors_fatal 28983 1726883041.37283: done checking for any_errors_fatal 28983 1726883041.37284: checking for max_fail_percentage 28983 1726883041.37287: done checking for max_fail_percentage 28983 1726883041.37288: checking to see if all hosts have failed and the running result is not ok 28983 1726883041.37289: done checking to see if all hosts have failed 28983 1726883041.37290: getting the remaining hosts for this loop 28983 1726883041.37293: done getting the remaining hosts for this loop 28983 1726883041.37300: getting the next task for host managed_node2 28983 1726883041.37312: done getting next task for host managed_node2 28983 
1726883041.37321: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883041.37329: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883041.37367: getting variables 28983 1726883041.37369: in VariableManager get_vars() 28983 1726883041.37423: Calling all_inventory to load vars for managed_node2 28983 1726883041.37426: Calling groups_inventory to load vars for managed_node2 28983 1726883041.37429: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883041.37765: Calling all_plugins_play to load vars for managed_node2 28983 1726883041.37769: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883041.37775: Calling groups_plugins_play to load vars for managed_node2 28983 1726883041.38633: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d3 28983 1726883041.38640: WORKER PROCESS EXITING 28983 1726883041.41316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883041.46444: done with get_vars() 28983 1726883041.46496: done getting variables 28983 1726883041.46633: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:44:01 -0400 (0:00:00.139) 0:01:11.464 ****** 28983 1726883041.46690: entering _queue_task() for managed_node2/set_fact 28983 1726883041.47267: worker is 1 (out of 1 available) 28983 1726883041.47280: exiting _queue_task() for managed_node2/set_fact 28983 1726883041.47294: done queuing things up, now waiting for results queue to drain 28983 1726883041.47296: waiting for pending results... 
28983 1726883041.47563: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883041.47787: in run() - task 0affe814-3a2d-b16d-c0a7-0000000012d4 28983 1726883041.47864: variable 'ansible_search_path' from source: unknown 28983 1726883041.47868: variable 'ansible_search_path' from source: unknown 28983 1726883041.47878: calling self._execute() 28983 1726883041.48013: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883041.48027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883041.48047: variable 'omit' from source: magic vars 28983 1726883041.48627: variable 'ansible_distribution_major_version' from source: facts 28983 1726883041.48632: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883041.48817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883041.49190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883041.49255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883041.49312: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883041.49362: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883041.49477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883041.49523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883041.49567: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883041.49616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883041.49762: variable '__network_is_ostree' from source: set_fact 28983 1726883041.49766: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883041.49769: when evaluation is False, skipping this task 28983 1726883041.49771: _execute() done 28983 1726883041.49774: dumping result to json 28983 1726883041.49824: done dumping result, returning 28983 1726883041.49830: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-0000000012d4] 28983 1726883041.49833: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d4 28983 1726883041.50038: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d4 28983 1726883041.50041: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883041.50106: no more pending results, returning what we have 28983 1726883041.50111: results queue empty 28983 1726883041.50112: checking for any_errors_fatal 28983 1726883041.50119: done checking for any_errors_fatal 28983 1726883041.50120: checking for max_fail_percentage 28983 1726883041.50123: done checking for max_fail_percentage 28983 1726883041.50124: checking to see if all hosts have failed and the running result is not ok 28983 1726883041.50126: done checking to see if all hosts have failed 28983 1726883041.50127: getting the remaining hosts for this loop 28983 1726883041.50129: done getting the remaining hosts for this loop 
28983 1726883041.50137: getting the next task for host managed_node2 28983 1726883041.50152: done getting next task for host managed_node2 28983 1726883041.50157: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883041.50164: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883041.50195: getting variables 28983 1726883041.50198: in VariableManager get_vars() 28983 1726883041.50362: Calling all_inventory to load vars for managed_node2 28983 1726883041.50366: Calling groups_inventory to load vars for managed_node2 28983 1726883041.50369: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883041.50380: Calling all_plugins_play to load vars for managed_node2 28983 1726883041.50384: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883041.50388: Calling groups_plugins_play to load vars for managed_node2 28983 1726883041.55213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883041.59458: done with get_vars() 28983 1726883041.59613: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:44:01 -0400 (0:00:00.131) 0:01:11.596 ****** 28983 1726883041.59857: entering _queue_task() for managed_node2/service_facts 28983 1726883041.60801: worker is 1 (out of 1 available) 28983 1726883041.60811: exiting _queue_task() for managed_node2/service_facts 28983 1726883041.60825: done queuing things up, now waiting for results queue to drain 28983 1726883041.60827: waiting for pending results... 
28983 1726883041.61081: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883041.61286: in run() - task 0affe814-3a2d-b16d-c0a7-0000000012d6 28983 1726883041.61291: variable 'ansible_search_path' from source: unknown 28983 1726883041.61294: variable 'ansible_search_path' from source: unknown 28983 1726883041.61325: calling self._execute() 28983 1726883041.61453: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883041.61468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883041.61489: variable 'omit' from source: magic vars 28983 1726883041.61977: variable 'ansible_distribution_major_version' from source: facts 28983 1726883041.61996: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883041.62008: variable 'omit' from source: magic vars 28983 1726883041.62126: variable 'omit' from source: magic vars 28983 1726883041.62187: variable 'omit' from source: magic vars 28983 1726883041.62243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883041.62301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883041.62372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883041.62381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883041.62385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883041.62428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883041.62441: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883041.62451: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883041.62697: Set connection var ansible_connection to ssh 28983 1726883041.62702: Set connection var ansible_shell_executable to /bin/sh 28983 1726883041.62705: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883041.62707: Set connection var ansible_timeout to 10 28983 1726883041.62710: Set connection var ansible_pipelining to False 28983 1726883041.62712: Set connection var ansible_shell_type to sh 28983 1726883041.62715: variable 'ansible_shell_executable' from source: unknown 28983 1726883041.62717: variable 'ansible_connection' from source: unknown 28983 1726883041.62720: variable 'ansible_module_compression' from source: unknown 28983 1726883041.62722: variable 'ansible_shell_type' from source: unknown 28983 1726883041.62724: variable 'ansible_shell_executable' from source: unknown 28983 1726883041.62726: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883041.62808: variable 'ansible_pipelining' from source: unknown 28983 1726883041.62812: variable 'ansible_timeout' from source: unknown 28983 1726883041.62814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883041.63009: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883041.63039: variable 'omit' from source: magic vars 28983 1726883041.63053: starting attempt loop 28983 1726883041.63067: running the handler 28983 1726883041.63087: _low_level_execute_command(): starting 28983 1726883041.63102: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883041.63920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883041.63948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883041.64022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883041.64089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883041.64152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883041.64156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883041.64344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883041.66047: stdout chunk (state=3): >>>/root <<< 28983 1726883041.66237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883041.66240: stdout chunk (state=3): >>><<< 28983 1726883041.66243: stderr chunk (state=3): >>><<< 28983 1726883041.66359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883041.66366: _low_level_execute_command(): starting 28983 1726883041.66380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049 `" && echo ansible-tmp-1726883041.6634326-31520-80068468049="` echo /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049 `" ) && sleep 0' 28983 1726883041.68141: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883041.68145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883041.68149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883041.68160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883041.68172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883041.68187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883041.68288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883041.70374: stdout chunk (state=3): >>>ansible-tmp-1726883041.6634326-31520-80068468049=/root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049 <<< 28983 1726883041.70527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883041.70605: stderr chunk (state=3): >>><<< 28983 1726883041.70608: stdout chunk (state=3): >>><<< 28983 1726883041.70645: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883041.6634326-31520-80068468049=/root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883041.70695: variable 'ansible_module_compression' from source: unknown 28983 1726883041.70846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883041.70985: variable 'ansible_facts' from source: unknown 28983 1726883041.71165: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py 28983 1726883041.71614: Sending initial data 28983 1726883041.71617: Sent initial data (158 bytes) 28983 1726883041.72961: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883041.73009: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883041.73071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883041.73091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883041.73432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883041.75133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883041.75199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883041.75267: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqgvdohp1 /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py <<< 28983 1726883041.75274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py" <<< 28983 1726883041.75485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpqgvdohp1" to remote "/root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py" <<< 28983 1726883041.78242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883041.78289: stderr chunk (state=3): >>><<< 28983 1726883041.78295: stdout chunk (state=3): >>><<< 28983 1726883041.78319: done transferring module to remote 28983 1726883041.78378: _low_level_execute_command(): starting 28983 1726883041.78382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/ /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py && sleep 0' 28983 1726883041.79840: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883041.79844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883041.79847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883041.79849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883041.80030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883041.80170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883041.80193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883041.80207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883041.80323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883041.82533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883041.82544: stderr chunk (state=3): >>><<< 28983 1726883041.82546: stdout chunk (state=3): >>><<< 28983 1726883041.82549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883041.82552: _low_level_execute_command(): starting 28983 1726883041.82554: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/AnsiballZ_service_facts.py && sleep 0' 28983 1726883041.83402: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883041.83406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883041.83409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883041.83411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883041.83414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883041.83416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
28983 1726883041.83419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883041.83586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883041.83696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883043.82727: stdout chunk (state=3): >>> <<< 28983 1726883043.82805: stdout chunk (state=3): >>>{"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": 
{"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.servi<<< 28983 1726883043.82882: stdout chunk (state=3): >>>ce": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": 
{"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", 
"source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": 
{"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883043.84541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883043.84662: stderr chunk (state=3): >>><<< 28983 1726883043.84666: stdout chunk (state=3): >>><<< 28983 1726883043.84676: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883043.87408: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883043.87470: _low_level_execute_command(): starting 28983 1726883043.87476: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883041.6634326-31520-80068468049/ > /dev/null 2>&1 && sleep 0' 28983 1726883043.88900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883043.88956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883043.89016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883043.89148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883043.91107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883043.91292: stderr chunk (state=3): >>><<< 28983 1726883043.91296: stdout chunk (state=3): >>><<< 28983 1726883043.91299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883043.91301: handler run complete 28983 1726883043.91887: variable 'ansible_facts' from source: unknown 28983 1726883043.92423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883043.93650: variable 'ansible_facts' from source: unknown 28983 1726883043.93887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883043.94504: attempt loop complete, returning result 28983 1726883043.94512: _execute() done 28983 1726883043.94515: dumping result to json 28983 1726883043.94606: done dumping result, returning 28983 1726883043.94616: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-0000000012d6] 28983 1726883043.94622: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d6 28983 1726883043.96568: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d6 28983 1726883043.96573: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883043.96744: no more pending results, returning what we have 28983 1726883043.96747: results queue empty 28983 1726883043.96748: checking for any_errors_fatal 28983 1726883043.96755: done checking for any_errors_fatal 28983 1726883043.96756: checking for max_fail_percentage 28983 1726883043.96758: done checking for max_fail_percentage 28983 1726883043.96759: checking to see if all hosts have failed and the running result is not ok 28983 1726883043.96760: done checking to see if all hosts have failed 28983 1726883043.96761: getting the remaining hosts for this loop 28983 1726883043.96763: 
done getting the remaining hosts for this loop 28983 1726883043.96768: getting the next task for host managed_node2 28983 1726883043.96778: done getting next task for host managed_node2 28983 1726883043.96783: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883043.96790: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883043.96805: getting variables 28983 1726883043.96807: in VariableManager get_vars() 28983 1726883043.96879: Calling all_inventory to load vars for managed_node2 28983 1726883043.96899: Calling groups_inventory to load vars for managed_node2 28983 1726883043.96903: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883043.96914: Calling all_plugins_play to load vars for managed_node2 28983 1726883043.96917: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883043.96927: Calling groups_plugins_play to load vars for managed_node2 28983 1726883043.99412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883044.03355: done with get_vars() 28983 1726883044.03396: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:44:04 -0400 (0:00:02.436) 0:01:14.033 ****** 28983 1726883044.03529: entering _queue_task() for managed_node2/package_facts 28983 1726883044.04031: worker is 1 (out of 1 available) 28983 1726883044.04049: exiting _queue_task() for managed_node2/package_facts 28983 1726883044.04063: done queuing things up, now waiting for results queue to drain 28983 1726883044.04065: waiting for pending results... 
28983 1726883044.04754: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883044.05103: in run() - task 0affe814-3a2d-b16d-c0a7-0000000012d7 28983 1726883044.05120: variable 'ansible_search_path' from source: unknown 28983 1726883044.05125: variable 'ansible_search_path' from source: unknown 28983 1726883044.05305: calling self._execute() 28983 1726883044.05467: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883044.05474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883044.05491: variable 'omit' from source: magic vars 28983 1726883044.06204: variable 'ansible_distribution_major_version' from source: facts 28983 1726883044.06431: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883044.06441: variable 'omit' from source: magic vars 28983 1726883044.06731: variable 'omit' from source: magic vars 28983 1726883044.06735: variable 'omit' from source: magic vars 28983 1726883044.06902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883044.07045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883044.07071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883044.07095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883044.07106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883044.07144: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883044.07148: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883044.07180: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883044.07519: Set connection var ansible_connection to ssh 28983 1726883044.07531: Set connection var ansible_shell_executable to /bin/sh 28983 1726883044.07616: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883044.07620: Set connection var ansible_timeout to 10 28983 1726883044.07622: Set connection var ansible_pipelining to False 28983 1726883044.07625: Set connection var ansible_shell_type to sh 28983 1726883044.07627: variable 'ansible_shell_executable' from source: unknown 28983 1726883044.07629: variable 'ansible_connection' from source: unknown 28983 1726883044.07632: variable 'ansible_module_compression' from source: unknown 28983 1726883044.07636: variable 'ansible_shell_type' from source: unknown 28983 1726883044.07639: variable 'ansible_shell_executable' from source: unknown 28983 1726883044.07641: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883044.07724: variable 'ansible_pipelining' from source: unknown 28983 1726883044.07728: variable 'ansible_timeout' from source: unknown 28983 1726883044.07747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883044.08147: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883044.08206: variable 'omit' from source: magic vars 28983 1726883044.08212: starting attempt loop 28983 1726883044.08215: running the handler 28983 1726883044.08232: _low_level_execute_command(): starting 28983 1726883044.08270: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883044.09066: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.09252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883044.09439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883044.09442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883044.11085: stdout chunk (state=3): >>>/root <<< 28983 1726883044.11306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883044.11312: stdout chunk (state=3): >>><<< 28983 1726883044.11321: stderr chunk (state=3): >>><<< 28983 1726883044.11362: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883044.11380: _low_level_execute_command(): starting 28983 1726883044.11388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128 `" && echo ansible-tmp-1726883044.1136217-31652-133250902082128="` echo /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128 `" ) && sleep 0' 28983 1726883044.12055: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883044.12064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883044.12079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883044.12096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883044.12110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883044.12119: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883044.12136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.12151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883044.12248: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883044.12265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883044.12369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883044.14506: stdout chunk (state=3): >>>ansible-tmp-1726883044.1136217-31652-133250902082128=/root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128 <<< 28983 1726883044.14751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883044.14760: stdout chunk (state=3): >>><<< 28983 1726883044.14767: stderr chunk (state=3): >>><<< 28983 1726883044.14889: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883044.1136217-31652-133250902082128=/root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883044.14892: variable 'ansible_module_compression' from source: unknown 28983 1726883044.14923: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883044.15195: variable 'ansible_facts' from source: unknown 28983 1726883044.15398: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py 28983 1726883044.15786: Sending initial data 28983 1726883044.15791: Sent initial data (162 bytes) 28983 1726883044.17257: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.17430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883044.17444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883044.17461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883044.17620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883044.19345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883044.19349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883044.19406: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmphu7s9360 /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py <<< 28983 1726883044.19426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py" <<< 28983 1726883044.19505: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmphu7s9360" to remote "/root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py" <<< 28983 1726883044.22454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883044.22588: stderr chunk (state=3): >>><<< 28983 1726883044.22591: stdout chunk (state=3): >>><<< 28983 1726883044.22594: done transferring module to remote 28983 1726883044.22596: _low_level_execute_command(): starting 28983 1726883044.22599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/ /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py && sleep 0' 28983 1726883044.23366: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.23391: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 28983 1726883044.23464: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.23517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883044.23549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883044.23577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883044.23737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883044.25696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883044.25700: stdout chunk (state=3): >>><<< 28983 1726883044.25812: stderr chunk (state=3): >>><<< 28983 1726883044.25825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883044.25830: _low_level_execute_command(): starting 28983 1726883044.25833: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/AnsiballZ_package_facts.py && sleep 0' 28983 1726883044.26364: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883044.26385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883044.26407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883044.26458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883044.26494: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883044.26513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.26601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 28983 1726883044.26650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883044.26779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883044.90257: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": 
"cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 28983 1726883044.90268: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 28983 1726883044.90338: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": 
"2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": 
"0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", 
"version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 28983 1726883044.90420: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": 
"1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": 
"sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5",
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", 
"version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": 
"perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": 
"0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": 
"1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", 
"release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": 
"7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", 
"release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 28983 1726883044.90502: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": 
"6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": 
"1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": 
[{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883044.92491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883044.92529: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883044.92533: stdout chunk (state=3): >>><<< 28983 1726883044.92538: stderr chunk (state=3): >>><<< 28983 1726883044.92752: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": 
"4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", 
"release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": 
"1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": 
"libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", 
"version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": 
"2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": 
"libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", 
"version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", 
"release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": 
"4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": 
[{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": 
"perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883044.97522: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883044.97571: _low_level_execute_command(): starting 28983 1726883044.97577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883044.1136217-31652-133250902082128/ > /dev/null 2>&1 && sleep 0' 28983 1726883044.98839: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883044.98903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883044.98989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883044.99051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883044.99074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883044.99118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883044.99260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883045.01439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883045.01443: stderr chunk (state=3): >>><<< 28983 1726883045.01445: stdout chunk (state=3): >>><<< 28983 1726883045.01448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883045.01450: handler run complete 28983 1726883045.02935: variable 'ansible_facts' from source: unknown 28983 1726883045.03813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.16327: variable 'ansible_facts' from source: unknown 28983 1726883045.17728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.20159: attempt loop complete, returning result 28983 1726883045.20195: _execute() done 28983 1726883045.20387: dumping result to json 28983 1726883045.20946: done dumping result, returning 28983 1726883045.20970: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-0000000012d7] 28983 1726883045.20980: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d7 28983 1726883045.26025: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000012d7 28983 1726883045.26028: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883045.26213: no more pending results, returning what we have 28983 1726883045.26217: results queue empty 28983 1726883045.26218: checking for any_errors_fatal 28983 1726883045.26223: done checking for any_errors_fatal 28983 1726883045.26225: checking for max_fail_percentage 28983 1726883045.26227: done checking for max_fail_percentage 28983 1726883045.26228: checking to see if all hosts have failed and the running result is not ok 28983 1726883045.26229: done checking to see if all hosts have failed 28983 1726883045.26230: getting the remaining hosts for this loop 28983 1726883045.26231: done getting the remaining hosts for this loop 28983 
1726883045.26238: getting the next task for host managed_node2 28983 1726883045.26247: done getting next task for host managed_node2 28983 1726883045.26252: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883045.26257: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883045.26271: getting variables 28983 1726883045.26275: in VariableManager get_vars() 28983 1726883045.26310: Calling all_inventory to load vars for managed_node2 28983 1726883045.26313: Calling groups_inventory to load vars for managed_node2 28983 1726883045.26316: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883045.26326: Calling all_plugins_play to load vars for managed_node2 28983 1726883045.26329: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883045.26336: Calling groups_plugins_play to load vars for managed_node2 28983 1726883045.33389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.36288: done with get_vars() 28983 1726883045.36331: done getting variables 28983 1726883045.36399: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:05 -0400 (0:00:01.329) 0:01:15.362 ****** 28983 1726883045.36445: entering _queue_task() for managed_node2/debug 28983 1726883045.36975: worker is 1 (out of 1 available) 28983 1726883045.36987: exiting _queue_task() for managed_node2/debug 28983 1726883045.36998: done queuing things up, now waiting for results queue to drain 28983 1726883045.37001: waiting for pending results... 
28983 1726883045.37244: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883045.37415: in run() - task 0affe814-3a2d-b16d-c0a7-00000000127b 28983 1726883045.37639: variable 'ansible_search_path' from source: unknown 28983 1726883045.37644: variable 'ansible_search_path' from source: unknown 28983 1726883045.37649: calling self._execute() 28983 1726883045.37651: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.37655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.37660: variable 'omit' from source: magic vars 28983 1726883045.38118: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.38140: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883045.38155: variable 'omit' from source: magic vars 28983 1726883045.38255: variable 'omit' from source: magic vars 28983 1726883045.38392: variable 'network_provider' from source: set_fact 28983 1726883045.38420: variable 'omit' from source: magic vars 28983 1726883045.38484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883045.38531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883045.38569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883045.38600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883045.38620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883045.38667: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883045.38680: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883045.38690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.38821: Set connection var ansible_connection to ssh 28983 1726883045.38843: Set connection var ansible_shell_executable to /bin/sh 28983 1726883045.38859: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883045.38883: Set connection var ansible_timeout to 10 28983 1726883045.38980: Set connection var ansible_pipelining to False 28983 1726883045.38983: Set connection var ansible_shell_type to sh 28983 1726883045.38986: variable 'ansible_shell_executable' from source: unknown 28983 1726883045.38988: variable 'ansible_connection' from source: unknown 28983 1726883045.38991: variable 'ansible_module_compression' from source: unknown 28983 1726883045.38993: variable 'ansible_shell_type' from source: unknown 28983 1726883045.38995: variable 'ansible_shell_executable' from source: unknown 28983 1726883045.38997: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.38999: variable 'ansible_pipelining' from source: unknown 28983 1726883045.39001: variable 'ansible_timeout' from source: unknown 28983 1726883045.39003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.39150: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883045.39171: variable 'omit' from source: magic vars 28983 1726883045.39185: starting attempt loop 28983 1726883045.39198: running the handler 28983 1726883045.39260: handler run complete 28983 1726883045.39288: attempt loop complete, returning result 28983 1726883045.39295: _execute() done 28983 1726883045.39308: dumping result to json 28983 1726883045.39316: done dumping result, returning 
28983 1726883045.39414: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-00000000127b] 28983 1726883045.39418: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127b 28983 1726883045.39500: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127b 28983 1726883045.39503: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883045.39597: no more pending results, returning what we have 28983 1726883045.39602: results queue empty 28983 1726883045.39603: checking for any_errors_fatal 28983 1726883045.39620: done checking for any_errors_fatal 28983 1726883045.39621: checking for max_fail_percentage 28983 1726883045.39623: done checking for max_fail_percentage 28983 1726883045.39624: checking to see if all hosts have failed and the running result is not ok 28983 1726883045.39625: done checking to see if all hosts have failed 28983 1726883045.39626: getting the remaining hosts for this loop 28983 1726883045.39629: done getting the remaining hosts for this loop 28983 1726883045.39636: getting the next task for host managed_node2 28983 1726883045.39647: done getting next task for host managed_node2 28983 1726883045.39652: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883045.39658: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883045.39679: getting variables 28983 1726883045.39680: in VariableManager get_vars() 28983 1726883045.39725: Calling all_inventory to load vars for managed_node2 28983 1726883045.39729: Calling groups_inventory to load vars for managed_node2 28983 1726883045.39732: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883045.39946: Calling all_plugins_play to load vars for managed_node2 28983 1726883045.39951: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883045.39955: Calling groups_plugins_play to load vars for managed_node2 28983 1726883045.42317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.45307: done with get_vars() 28983 1726883045.45351: done getting variables 28983 1726883045.45421: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:05 -0400 (0:00:00.090) 0:01:15.452 ****** 28983 1726883045.45471: entering _queue_task() for managed_node2/fail 28983 1726883045.45959: worker is 1 (out of 1 available) 28983 1726883045.45975: exiting _queue_task() for managed_node2/fail 28983 1726883045.45989: done queuing things up, now waiting for results queue to drain 28983 1726883045.45990: waiting for pending results... 28983 1726883045.46339: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883045.46428: in run() - task 0affe814-3a2d-b16d-c0a7-00000000127c 28983 1726883045.46458: variable 'ansible_search_path' from source: unknown 28983 1726883045.46468: variable 'ansible_search_path' from source: unknown 28983 1726883045.46516: calling self._execute() 28983 1726883045.46640: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.46655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.46682: variable 'omit' from source: magic vars 28983 1726883045.47143: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.47164: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883045.47330: variable 'network_state' from source: role '' defaults 28983 1726883045.47540: Evaluated conditional (network_state != {}): False 28983 1726883045.47543: when evaluation is False, skipping this task 28983 1726883045.47545: _execute() done 28983 1726883045.47548: dumping result to json 28983 1726883045.47550: done dumping result, returning 28983 1726883045.47552: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-00000000127c] 28983 1726883045.47555: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127c 28983 1726883045.47633: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127c 28983 1726883045.47638: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883045.47698: no more pending results, returning what we have 28983 1726883045.47704: results queue empty 28983 1726883045.47705: checking for any_errors_fatal 28983 1726883045.47711: done checking for any_errors_fatal 28983 1726883045.47712: checking for max_fail_percentage 28983 1726883045.47714: done checking for max_fail_percentage 28983 1726883045.47716: checking to see if all hosts have failed and the running result is not ok 28983 1726883045.47716: done checking to see if all hosts have failed 28983 1726883045.47717: getting the remaining hosts for this loop 28983 1726883045.47719: done getting the remaining hosts for this loop 28983 1726883045.47725: getting the next task for host managed_node2 28983 1726883045.47738: done getting next task for host managed_node2 28983 1726883045.47742: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883045.47750: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883045.47785: getting variables 28983 1726883045.47787: in VariableManager get_vars() 28983 1726883045.47831: Calling all_inventory to load vars for managed_node2 28983 1726883045.48368: Calling groups_inventory to load vars for managed_node2 28983 1726883045.48376: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883045.48386: Calling all_plugins_play to load vars for managed_node2 28983 1726883045.48390: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883045.48394: Calling groups_plugins_play to load vars for managed_node2 28983 1726883045.50660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.57048: done with get_vars() 28983 1726883045.57094: done getting variables 28983 1726883045.57167: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:05 -0400 (0:00:00.117) 0:01:15.569 ****** 28983 1726883045.57216: entering _queue_task() for managed_node2/fail 28983 1726883045.57630: worker is 1 (out of 1 available) 28983 1726883045.57645: exiting _queue_task() for managed_node2/fail 28983 1726883045.57660: done queuing things up, now waiting for results queue to drain 28983 1726883045.57662: waiting for pending results... 28983 1726883045.58055: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883045.58198: in run() - task 0affe814-3a2d-b16d-c0a7-00000000127d 28983 1726883045.58222: variable 'ansible_search_path' from source: unknown 28983 1726883045.58233: variable 'ansible_search_path' from source: unknown 28983 1726883045.58287: calling self._execute() 28983 1726883045.58420: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.58436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.58454: variable 'omit' from source: magic vars 28983 1726883045.58921: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.58940: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883045.59092: variable 'network_state' from source: role '' defaults 28983 1726883045.59110: Evaluated conditional (network_state != {}): False 28983 1726883045.59120: when evaluation is False, skipping this task 28983 1726883045.59133: _execute() done 28983 1726883045.59145: dumping result to json 28983 1726883045.59240: done dumping result, returning 28983 1726883045.59245: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-00000000127d] 28983 1726883045.59249: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127d 28983 1726883045.59325: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127d 28983 1726883045.59329: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883045.59399: no more pending results, returning what we have 28983 1726883045.59404: results queue empty 28983 1726883045.59405: checking for any_errors_fatal 28983 1726883045.59414: done checking for any_errors_fatal 28983 1726883045.59415: checking for max_fail_percentage 28983 1726883045.59417: done checking for max_fail_percentage 28983 1726883045.59418: checking to see if all hosts have failed and the running result is not ok 28983 1726883045.59419: done checking to see if all hosts have failed 28983 1726883045.59420: getting the remaining hosts for this loop 28983 1726883045.59422: done getting the remaining hosts for this loop 28983 1726883045.59427: getting the next task for host managed_node2 28983 1726883045.59440: done getting next task for host managed_node2 28983 1726883045.59445: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883045.59453: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883045.59483: getting variables 28983 1726883045.59485: in VariableManager get_vars() 28983 1726883045.59530: Calling all_inventory to load vars for managed_node2 28983 1726883045.59736: Calling groups_inventory to load vars for managed_node2 28983 1726883045.59741: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883045.59751: Calling all_plugins_play to load vars for managed_node2 28983 1726883045.59755: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883045.59759: Calling groups_plugins_play to load vars for managed_node2 28983 1726883045.62109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.65010: done with get_vars() 28983 1726883045.65047: done getting variables 28983 1726883045.65115: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:05 -0400 (0:00:00.079) 0:01:15.649 ****** 28983 1726883045.65158: entering _queue_task() for managed_node2/fail 28983 1726883045.65473: worker is 1 (out of 1 available) 28983 1726883045.65487: exiting _queue_task() for managed_node2/fail 28983 1726883045.65501: done queuing things up, now waiting for results queue to drain 28983 1726883045.65503: waiting for pending results... 28983 1726883045.65865: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883045.66007: in run() - task 0affe814-3a2d-b16d-c0a7-00000000127e 28983 1726883045.66140: variable 'ansible_search_path' from source: unknown 28983 1726883045.66144: variable 'ansible_search_path' from source: unknown 28983 1726883045.66147: calling self._execute() 28983 1726883045.66203: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.66217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.66238: variable 'omit' from source: magic vars 28983 1726883045.66688: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.66714: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883045.66966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883045.69648: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883045.69743: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883045.69789: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883045.69853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883045.69882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883045.70140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883045.70143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883045.70146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883045.70148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883045.70150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883045.70238: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.70260: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883045.70411: variable 'ansible_distribution' from source: facts 28983 1726883045.70420: variable '__network_rh_distros' from source: role '' defaults 28983 1726883045.70437: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883045.70446: when evaluation is False, skipping this task 28983 
1726883045.70454: _execute() done 28983 1726883045.70463: dumping result to json 28983 1726883045.70472: done dumping result, returning 28983 1726883045.70489: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-00000000127e] 28983 1726883045.70501: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127e 28983 1726883045.70861: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127e 28983 1726883045.70864: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883045.71025: no more pending results, returning what we have 28983 1726883045.71029: results queue empty 28983 1726883045.71030: checking for any_errors_fatal 28983 1726883045.71036: done checking for any_errors_fatal 28983 1726883045.71037: checking for max_fail_percentage 28983 1726883045.71039: done checking for max_fail_percentage 28983 1726883045.71040: checking to see if all hosts have failed and the running result is not ok 28983 1726883045.71041: done checking to see if all hosts have failed 28983 1726883045.71042: getting the remaining hosts for this loop 28983 1726883045.71044: done getting the remaining hosts for this loop 28983 1726883045.71048: getting the next task for host managed_node2 28983 1726883045.71056: done getting next task for host managed_node2 28983 1726883045.71061: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883045.71067: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883045.71091: getting variables 28983 1726883045.71093: in VariableManager get_vars() 28983 1726883045.71132: Calling all_inventory to load vars for managed_node2 28983 1726883045.71137: Calling groups_inventory to load vars for managed_node2 28983 1726883045.71140: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883045.71150: Calling all_plugins_play to load vars for managed_node2 28983 1726883045.71154: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883045.71157: Calling groups_plugins_play to load vars for managed_node2 28983 1726883045.74090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.78298: done with get_vars() 28983 1726883045.78538: done getting variables 28983 1726883045.78606: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:05 -0400 (0:00:00.134) 0:01:15.784 ****** 28983 1726883045.78647: entering _queue_task() for managed_node2/dnf 28983 1726883045.79203: worker is 1 (out of 1 available) 28983 1726883045.79216: exiting _queue_task() for managed_node2/dnf 28983 1726883045.79231: done queuing things up, now waiting for results queue to drain 28983 1726883045.79237: waiting for pending results... 28983 1726883045.79613: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883045.79793: in run() - task 0affe814-3a2d-b16d-c0a7-00000000127f 28983 1726883045.79810: variable 'ansible_search_path' from source: unknown 28983 1726883045.79813: variable 'ansible_search_path' from source: unknown 28983 1726883045.79862: calling self._execute() 28983 1726883045.79987: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.79991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.80097: variable 'omit' from source: magic vars 28983 1726883045.80493: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.80507: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883045.80788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883045.84655: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883045.84752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883045.84795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883045.84845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883045.84875: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883045.84973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883045.85013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883045.85049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883045.85104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883045.85123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883045.85295: variable 'ansible_distribution' from source: facts 28983 1726883045.85299: variable 'ansible_distribution_major_version' from source: facts 28983 1726883045.85302: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883045.85431: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883045.86005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883045.86239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883045.86245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883045.86320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883045.86338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883045.86503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883045.86535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883045.86565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883045.86639: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883045.86658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883045.86871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883045.86976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883045.87098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883045.87201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883045.87404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883045.88153: variable 'network_connections' from source: include params 28983 1726883045.88367: variable 'interface' from source: play vars 28983 1726883045.88494: variable 'interface' from source: play vars 28983 1726883045.88732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883045.89263: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883045.89422: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883045.89516: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883045.89555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883045.89642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883045.89699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883045.89944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883045.89948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883045.89951: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883045.90469: variable 'network_connections' from source: include params 28983 1726883045.90477: variable 'interface' from source: play vars 28983 1726883045.90659: variable 'interface' from source: play vars 28983 1726883045.90687: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883045.90691: when evaluation is False, skipping this task 28983 1726883045.90694: _execute() done 28983 1726883045.90699: dumping result to json 28983 1726883045.90704: done dumping result, returning 28983 1726883045.90715: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000127f] 28983 1726883045.90721: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127f 28983 1726883045.90831: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000127f 28983 1726883045.90837: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883045.90922: no more pending results, returning what we have 28983 1726883045.90926: results queue empty 28983 1726883045.90927: checking for any_errors_fatal 28983 1726883045.90938: done checking for any_errors_fatal 28983 1726883045.90939: checking for max_fail_percentage 28983 1726883045.90941: done checking for max_fail_percentage 28983 1726883045.90942: checking to see if all hosts have failed and the running result is not ok 28983 1726883045.90943: done checking to see if all hosts have failed 28983 1726883045.90943: getting the remaining hosts for this loop 28983 1726883045.90946: done getting the remaining hosts for this loop 28983 1726883045.90951: getting the next task for host managed_node2 28983 1726883045.90959: done getting next task for host managed_node2 28983 1726883045.90963: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883045.90969: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883045.90994: getting variables 28983 1726883045.90996: in VariableManager get_vars() 28983 1726883045.91138: Calling all_inventory to load vars for managed_node2 28983 1726883045.91142: Calling groups_inventory to load vars for managed_node2 28983 1726883045.91145: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883045.91155: Calling all_plugins_play to load vars for managed_node2 28983 1726883045.91159: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883045.91163: Calling groups_plugins_play to load vars for managed_node2 28983 1726883045.93470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883045.98465: done with get_vars() 28983 1726883045.98517: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883045.98646: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:05 -0400 (0:00:00.200) 0:01:15.984 ****** 28983 1726883045.98689: entering _queue_task() for managed_node2/yum 28983 1726883045.99149: worker is 1 (out of 1 available) 28983 1726883045.99162: exiting _queue_task() for managed_node2/yum 28983 1726883045.99181: done queuing things up, now waiting for results queue to drain 28983 1726883045.99183: waiting for pending results... 28983 1726883045.99451: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883045.99740: in run() - task 0affe814-3a2d-b16d-c0a7-000000001280 28983 1726883045.99744: variable 'ansible_search_path' from source: unknown 28983 1726883045.99747: variable 'ansible_search_path' from source: unknown 28983 1726883045.99751: calling self._execute() 28983 1726883045.99819: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883045.99836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883045.99857: variable 'omit' from source: magic vars 28983 1726883046.00939: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.00943: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883046.01201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883046.05795: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883046.05932: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883046.06009: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883046.06089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883046.06151: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883046.06255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.06302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.06353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.06430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.06478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.06629: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.06655: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883046.06665: when evaluation is False, skipping this task 28983 1726883046.06683: _execute() done 28983 1726883046.06700: dumping result to json 28983 1726883046.06718: done dumping result, returning 28983 1726883046.06743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001280] 28983 1726883046.06943: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001280 28983 1726883046.07030: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001280 28983 1726883046.07035: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883046.07103: no more pending results, returning what we have 28983 1726883046.07107: results queue empty 28983 1726883046.07108: checking for any_errors_fatal 28983 1726883046.07114: done checking for any_errors_fatal 28983 1726883046.07115: checking for max_fail_percentage 28983 1726883046.07117: done checking for max_fail_percentage 28983 1726883046.07118: checking to see if all hosts have failed and the running result is not ok 28983 1726883046.07119: done checking to see if all hosts have failed 28983 1726883046.07120: getting the remaining hosts for this loop 28983 1726883046.07122: done getting the remaining hosts for this loop 28983 1726883046.07128: getting the next task for host managed_node2 28983 1726883046.07138: done getting next task for host managed_node2 28983 1726883046.07144: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883046.07150: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883046.07177: getting variables 28983 1726883046.07179: in VariableManager get_vars() 28983 1726883046.07220: Calling all_inventory to load vars for managed_node2 28983 1726883046.07223: Calling groups_inventory to load vars for managed_node2 28983 1726883046.07225: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883046.07306: Calling all_plugins_play to load vars for managed_node2 28983 1726883046.07312: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883046.07318: Calling groups_plugins_play to load vars for managed_node2 28983 1726883046.10400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883046.13427: done with get_vars() 28983 1726883046.13468: done getting variables 28983 1726883046.13543: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:06 -0400 (0:00:00.148) 0:01:16.133 ****** 28983 1726883046.13589: entering _queue_task() for managed_node2/fail 28983 1726883046.14156: worker is 1 (out of 1 available) 28983 1726883046.14167: exiting _queue_task() for managed_node2/fail 28983 1726883046.14181: done queuing things up, now waiting for results queue to drain 28983 1726883046.14183: waiting for pending results... 
28983 1726883046.14661: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883046.14905: in run() - task 0affe814-3a2d-b16d-c0a7-000000001281 28983 1726883046.14926: variable 'ansible_search_path' from source: unknown 28983 1726883046.14981: variable 'ansible_search_path' from source: unknown 28983 1726883046.15026: calling self._execute() 28983 1726883046.15250: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883046.15351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883046.15370: variable 'omit' from source: magic vars 28983 1726883046.15961: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.15984: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883046.16158: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883046.16437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883046.19901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883046.20005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883046.20056: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883046.20107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883046.20145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883046.20247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883046.20296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.20332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.20396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.20419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.20485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.20524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.20562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.20622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.20646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.20702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.20742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.20777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.20838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.20860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.21100: variable 'network_connections' from source: include params 28983 1726883046.21120: variable 'interface' from source: play vars 28983 1726883046.21209: variable 'interface' from source: play vars 28983 1726883046.21308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883046.21521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883046.21752: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883046.21805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883046.21914: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883046.21918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883046.21944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883046.22064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.22103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883046.22166: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883046.22503: variable 'network_connections' from source: include params 28983 1726883046.22515: variable 'interface' from source: play vars 28983 1726883046.22598: variable 'interface' from source: play vars 28983 1726883046.22633: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883046.22787: when evaluation is False, skipping this task 28983 1726883046.22791: _execute() done 28983 1726883046.22794: dumping result to json 28983 1726883046.22796: done dumping result, returning 28983 1726883046.22799: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001281] 28983 1726883046.22801: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001281 skipping: [managed_node2] => { "changed": false, 
"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883046.23157: no more pending results, returning what we have 28983 1726883046.23161: results queue empty 28983 1726883046.23162: checking for any_errors_fatal 28983 1726883046.23173: done checking for any_errors_fatal 28983 1726883046.23174: checking for max_fail_percentage 28983 1726883046.23176: done checking for max_fail_percentage 28983 1726883046.23178: checking to see if all hosts have failed and the running result is not ok 28983 1726883046.23179: done checking to see if all hosts have failed 28983 1726883046.23180: getting the remaining hosts for this loop 28983 1726883046.23182: done getting the remaining hosts for this loop 28983 1726883046.23188: getting the next task for host managed_node2 28983 1726883046.23199: done getting next task for host managed_node2 28983 1726883046.23204: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883046.23211: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883046.23242: getting variables 28983 1726883046.23244: in VariableManager get_vars() 28983 1726883046.23292: Calling all_inventory to load vars for managed_node2 28983 1726883046.23295: Calling groups_inventory to load vars for managed_node2 28983 1726883046.23298: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883046.23309: Calling all_plugins_play to load vars for managed_node2 28983 1726883046.23313: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883046.23317: Calling groups_plugins_play to load vars for managed_node2 28983 1726883046.23851: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001281 28983 1726883046.23855: WORKER PROCESS EXITING 28983 1726883046.25870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883046.30852: done with get_vars() 28983 1726883046.30892: done getting variables 28983 1726883046.30961: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:06 -0400 (0:00:00.174) 0:01:16.307 ****** 28983 1726883046.31006: entering _queue_task() for managed_node2/package 28983 1726883046.31346: worker is 1 (out of 1 available) 28983 1726883046.31360: exiting _queue_task() for managed_node2/package 28983 1726883046.31376: done queuing 
things up, now waiting for results queue to drain 28983 1726883046.31378: waiting for pending results... 28983 1726883046.31711: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883046.31884: in run() - task 0affe814-3a2d-b16d-c0a7-000000001282 28983 1726883046.31899: variable 'ansible_search_path' from source: unknown 28983 1726883046.31903: variable 'ansible_search_path' from source: unknown 28983 1726883046.31949: calling self._execute() 28983 1726883046.32067: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883046.32077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883046.32089: variable 'omit' from source: magic vars 28983 1726883046.32532: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.32547: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883046.32804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883046.33475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883046.33690: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883046.33693: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883046.33877: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883046.34124: variable 'network_packages' from source: role '' defaults 28983 1726883046.34547: variable '__network_provider_setup' from source: role '' defaults 28983 1726883046.34560: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883046.34713: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883046.34839: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726883046.34850: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883046.35132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883046.38361: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883046.38530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883046.38853: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883046.38857: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883046.39005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883046.39168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.39238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.39621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.39627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.39632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.40115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.40150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.40227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.40820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.40849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.41560: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883046.41763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.41793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.41830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883046.41971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.41990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.42203: variable 'ansible_python' from source: facts 28983 1726883046.42229: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883046.42554: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883046.42558: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883046.42741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.42772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.42800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.42949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.42953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883046.42955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883046.42965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883046.43001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883046.43103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883046.43121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883046.43540: variable 'network_connections' from source: include params 28983 1726883046.43544: variable 'interface' from source: play vars 28983 1726883046.43547: variable 'interface' from source: play vars 28983 1726883046.43591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883046.43706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883046.43752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 
1726883046.43821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883046.43878: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883046.44553: variable 'network_connections' from source: include params 28983 1726883046.44558: variable 'interface' from source: play vars 28983 1726883046.44691: variable 'interface' from source: play vars 28983 1726883046.44843: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883046.45058: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883046.46482: variable 'network_connections' from source: include params 28983 1726883046.46492: variable 'interface' from source: play vars 28983 1726883046.46781: variable 'interface' from source: play vars 28983 1726883046.46810: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883046.47541: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883046.48439: variable 'network_connections' from source: include params 28983 1726883046.48452: variable 'interface' from source: play vars 28983 1726883046.48656: variable 'interface' from source: play vars 28983 1726883046.48739: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883046.48855: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883046.49009: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883046.49091: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883046.49845: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883046.51375: variable 'network_connections' from source: include params 28983 
1726883046.51388: variable 'interface' from source: play vars 28983 1726883046.51716: variable 'interface' from source: play vars 28983 1726883046.51825: variable 'ansible_distribution' from source: facts 28983 1726883046.51828: variable '__network_rh_distros' from source: role '' defaults 28983 1726883046.51831: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.51835: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883046.52129: variable 'ansible_distribution' from source: facts 28983 1726883046.52248: variable '__network_rh_distros' from source: role '' defaults 28983 1726883046.52265: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.52282: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883046.52751: variable 'ansible_distribution' from source: facts 28983 1726883046.52762: variable '__network_rh_distros' from source: role '' defaults 28983 1726883046.52822: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.52883: variable 'network_provider' from source: set_fact 28983 1726883046.53052: variable 'ansible_facts' from source: unknown 28983 1726883046.55674: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883046.55741: when evaluation is False, skipping this task 28983 1726883046.55751: _execute() done 28983 1726883046.55760: dumping result to json 28983 1726883046.55769: done dumping result, returning 28983 1726883046.55787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000001282] 28983 1726883046.55804: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001282 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": 
"Conditional result was False" } 28983 1726883046.56078: no more pending results, returning what we have 28983 1726883046.56083: results queue empty 28983 1726883046.56085: checking for any_errors_fatal 28983 1726883046.56094: done checking for any_errors_fatal 28983 1726883046.56095: checking for max_fail_percentage 28983 1726883046.56097: done checking for max_fail_percentage 28983 1726883046.56099: checking to see if all hosts have failed and the running result is not ok 28983 1726883046.56100: done checking to see if all hosts have failed 28983 1726883046.56101: getting the remaining hosts for this loop 28983 1726883046.56103: done getting the remaining hosts for this loop 28983 1726883046.56109: getting the next task for host managed_node2 28983 1726883046.56120: done getting next task for host managed_node2 28983 1726883046.56125: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883046.56132: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883046.56163: getting variables 28983 1726883046.56165: in VariableManager get_vars() 28983 1726883046.56216: Calling all_inventory to load vars for managed_node2 28983 1726883046.56220: Calling groups_inventory to load vars for managed_node2 28983 1726883046.56228: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883046.56444: Calling all_plugins_play to load vars for managed_node2 28983 1726883046.56450: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883046.56456: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001282 28983 1726883046.56459: WORKER PROCESS EXITING 28983 1726883046.56464: Calling groups_plugins_play to load vars for managed_node2 28983 1726883046.61990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883046.68231: done with get_vars() 28983 1726883046.68278: done getting variables 28983 1726883046.68556: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:06 -0400 (0:00:00.375) 0:01:16.683 ****** 28983 1726883046.68599: entering _queue_task() for managed_node2/package 28983 1726883046.69263: worker is 1 (out of 1 available) 28983 1726883046.69278: exiting _queue_task() for managed_node2/package 28983 1726883046.69295: done queuing things up, now waiting 
for results queue to drain 28983 1726883046.69297: waiting for pending results... 28983 1726883046.70046: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883046.70809: in run() - task 0affe814-3a2d-b16d-c0a7-000000001283 28983 1726883046.70825: variable 'ansible_search_path' from source: unknown 28983 1726883046.70829: variable 'ansible_search_path' from source: unknown 28983 1726883046.70896: calling self._execute() 28983 1726883046.71243: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883046.71251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883046.71267: variable 'omit' from source: magic vars 28983 1726883046.73029: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.73046: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883046.73323: variable 'network_state' from source: role '' defaults 28983 1726883046.73606: Evaluated conditional (network_state != {}): False 28983 1726883046.73610: when evaluation is False, skipping this task 28983 1726883046.73613: _execute() done 28983 1726883046.73619: dumping result to json 28983 1726883046.73623: done dumping result, returning 28983 1726883046.73634: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001283] 28983 1726883046.73642: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001283 28983 1726883046.73892: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001283 28983 1726883046.73895: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883046.73955: no more pending results, returning what we 
have 28983 1726883046.73960: results queue empty 28983 1726883046.73961: checking for any_errors_fatal 28983 1726883046.73969: done checking for any_errors_fatal 28983 1726883046.73970: checking for max_fail_percentage 28983 1726883046.73972: done checking for max_fail_percentage 28983 1726883046.73973: checking to see if all hosts have failed and the running result is not ok 28983 1726883046.73974: done checking to see if all hosts have failed 28983 1726883046.73975: getting the remaining hosts for this loop 28983 1726883046.73978: done getting the remaining hosts for this loop 28983 1726883046.73983: getting the next task for host managed_node2 28983 1726883046.73993: done getting next task for host managed_node2 28983 1726883046.73998: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883046.74004: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883046.74038: getting variables 28983 1726883046.74040: in VariableManager get_vars() 28983 1726883046.74091: Calling all_inventory to load vars for managed_node2 28983 1726883046.74095: Calling groups_inventory to load vars for managed_node2 28983 1726883046.74098: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883046.74110: Calling all_plugins_play to load vars for managed_node2 28983 1726883046.74114: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883046.74118: Calling groups_plugins_play to load vars for managed_node2 28983 1726883046.79085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883046.85251: done with get_vars() 28983 1726883046.85308: done getting variables 28983 1726883046.85382: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:06 -0400 (0:00:00.168) 0:01:16.852 ****** 28983 1726883046.85428: entering _queue_task() for managed_node2/package 28983 1726883046.86540: worker is 1 (out of 1 available) 28983 1726883046.86553: exiting _queue_task() for managed_node2/package 28983 1726883046.86565: done queuing things up, now waiting for results queue to drain 28983 1726883046.86567: waiting for pending results... 
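
The skip above ("Install NetworkManager and nmstate when using network_state variable", false_condition `network_state != {}`) is the normal outcome when no `network_state` is supplied. A hedged sketch of the kind of task producing this log entry follows; the real task lives in the `fedora.linux_system_roles.network` role (tasks/main.yml:85) and its exact contents may differ:

```yaml
# Sketch only: approximates the role task whose conditional the log
# evaluates. Package names and structure are assumptions, not verbatim
# role content.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}   # evaluated False above, so the task is skipped
```

With `network_state` left at its default empty mapping, the `when` clause is False and Ansible emits exactly the `skipping: [managed_node2]` result seen in the log.
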
28983 1726883046.87152: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883046.87370: in run() - task 0affe814-3a2d-b16d-c0a7-000000001284 28983 1726883046.87394: variable 'ansible_search_path' from source: unknown 28983 1726883046.87512: variable 'ansible_search_path' from source: unknown 28983 1726883046.87515: calling self._execute() 28983 1726883046.87942: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883046.87947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883046.87950: variable 'omit' from source: magic vars 28983 1726883046.88799: variable 'ansible_distribution_major_version' from source: facts 28983 1726883046.88817: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883046.89138: variable 'network_state' from source: role '' defaults 28983 1726883046.89214: Evaluated conditional (network_state != {}): False 28983 1726883046.89223: when evaluation is False, skipping this task 28983 1726883046.89231: _execute() done 28983 1726883046.89242: dumping result to json 28983 1726883046.89315: done dumping result, returning 28983 1726883046.89335: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001284] 28983 1726883046.89349: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001284 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883046.89696: no more pending results, returning what we have 28983 1726883046.89700: results queue empty 28983 1726883046.89701: checking for any_errors_fatal 28983 1726883046.89709: done checking for any_errors_fatal 28983 1726883046.89710: checking for max_fail_percentage 28983 
1726883046.89712: done checking for max_fail_percentage 28983 1726883046.89713: checking to see if all hosts have failed and the running result is not ok 28983 1726883046.89714: done checking to see if all hosts have failed 28983 1726883046.89715: getting the remaining hosts for this loop 28983 1726883046.89717: done getting the remaining hosts for this loop 28983 1726883046.89723: getting the next task for host managed_node2 28983 1726883046.89732: done getting next task for host managed_node2 28983 1726883046.89741: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883046.89748: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883046.89781: getting variables 28983 1726883046.89784: in VariableManager get_vars() 28983 1726883046.89833: Calling all_inventory to load vars for managed_node2 28983 1726883046.89940: Calling groups_inventory to load vars for managed_node2 28983 1726883046.89948: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883046.89956: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001284 28983 1726883046.89959: WORKER PROCESS EXITING 28983 1726883046.89975: Calling all_plugins_play to load vars for managed_node2 28983 1726883046.89980: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883046.89985: Calling groups_plugins_play to load vars for managed_node2 28983 1726883046.95116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883047.01300: done with get_vars() 28983 1726883047.01508: done getting variables 28983 1726883047.01609: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:07 -0400 (0:00:00.163) 0:01:17.015 ****** 28983 1726883047.01776: entering _queue_task() for managed_node2/service 28983 1726883047.02792: worker is 1 (out of 1 available) 28983 1726883047.02811: exiting _queue_task() for managed_node2/service 28983 1726883047.02826: done queuing things up, now waiting for results queue to drain 28983 1726883047.02828: waiting for pending results... 
28983 1726883047.03513: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883047.04073: in run() - task 0affe814-3a2d-b16d-c0a7-000000001285 28983 1726883047.04363: variable 'ansible_search_path' from source: unknown 28983 1726883047.04366: variable 'ansible_search_path' from source: unknown 28983 1726883047.04398: calling self._execute() 28983 1726883047.04657: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883047.04672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883047.04728: variable 'omit' from source: magic vars 28983 1726883047.05212: variable 'ansible_distribution_major_version' from source: facts 28983 1726883047.05232: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883047.05408: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883047.05693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883047.09936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883047.10029: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883047.10084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883047.10152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883047.10181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883047.10370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883047.10373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.10409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.10482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.10507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.10592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.10645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.10683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.10744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.10809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.10840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.10882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.10967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.10988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.11010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.11450: variable 'network_connections' from source: include params 28983 1726883047.11453: variable 'interface' from source: play vars 28983 1726883047.11668: variable 'interface' from source: play vars 28983 1726883047.11824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883047.12211: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883047.12424: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883047.12539: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883047.12799: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883047.12804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883047.12840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883047.12946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.12990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883047.13236: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883047.13905: variable 'network_connections' from source: include params 28983 1726883047.14001: variable 'interface' from source: play vars 28983 1726883047.14084: variable 'interface' from source: play vars 28983 1726883047.14169: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883047.14217: when evaluation is False, skipping this task 28983 1726883047.14226: _execute() done 28983 1726883047.14236: dumping result to json 28983 1726883047.14245: done dumping result, returning 28983 1726883047.14337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001285] 28983 1726883047.14350: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001285 skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883047.14608: no more pending results, returning what we have 28983 1726883047.14616: results queue empty 28983 1726883047.14617: checking for any_errors_fatal 28983 1726883047.14629: done checking for any_errors_fatal 28983 1726883047.14630: checking for max_fail_percentage 28983 1726883047.14633: done checking for max_fail_percentage 28983 1726883047.14639: checking to see if all hosts have failed and the running result is not ok 28983 1726883047.14640: done checking to see if all hosts have failed 28983 1726883047.14641: getting the remaining hosts for this loop 28983 1726883047.14644: done getting the remaining hosts for this loop 28983 1726883047.14650: getting the next task for host managed_node2 28983 1726883047.14660: done getting next task for host managed_node2 28983 1726883047.14666: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883047.14672: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883047.14704: getting variables 28983 1726883047.14706: in VariableManager get_vars() 28983 1726883047.15084: Calling all_inventory to load vars for managed_node2 28983 1726883047.15088: Calling groups_inventory to load vars for managed_node2 28983 1726883047.15090: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883047.15245: Calling all_plugins_play to load vars for managed_node2 28983 1726883047.15250: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883047.15254: Calling groups_plugins_play to load vars for managed_node2 28983 1726883047.16005: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001285 28983 1726883047.16009: WORKER PROCESS EXITING 28983 1726883047.22009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883047.30062: done with get_vars() 28983 1726883047.30124: done getting variables 28983 1726883047.30402: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:07 -0400 (0:00:00.286) 0:01:17.302 ****** 28983 1726883047.30448: entering _queue_task() for managed_node2/service 28983 1726883047.31245: worker is 1 (out of 1 available) 28983 1726883047.31261: exiting _queue_task() for managed_node2/service 28983 1726883047.31276: done queuing 
things up, now waiting for results queue to drain 28983 1726883047.31278: waiting for pending results... 28983 1726883047.31933: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883047.32216: in run() - task 0affe814-3a2d-b16d-c0a7-000000001286 28983 1726883047.32231: variable 'ansible_search_path' from source: unknown 28983 1726883047.32237: variable 'ansible_search_path' from source: unknown 28983 1726883047.32280: calling self._execute() 28983 1726883047.32726: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883047.32733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883047.32737: variable 'omit' from source: magic vars 28983 1726883047.33737: variable 'ansible_distribution_major_version' from source: facts 28983 1726883047.33742: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883047.34242: variable 'network_provider' from source: set_fact 28983 1726883047.34246: variable 'network_state' from source: role '' defaults 28983 1726883047.34249: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883047.34251: variable 'omit' from source: magic vars 28983 1726883047.34511: variable 'omit' from source: magic vars 28983 1726883047.34514: variable 'network_service_name' from source: role '' defaults 28983 1726883047.34662: variable 'network_service_name' from source: role '' defaults 28983 1726883047.34918: variable '__network_provider_setup' from source: role '' defaults 28983 1726883047.34949: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883047.35124: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883047.35142: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883047.35312: variable '__network_packages_default_nm' from source: role '' defaults 
28983 1726883047.35991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883047.42517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883047.42784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883047.42902: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883047.43155: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883047.43158: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883047.43382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.43386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.43460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.43573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.43706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.43840: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.43858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.43896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.44141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.44145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.44733: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883047.45074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.45161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.45308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.45558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.45562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.45738: variable 'ansible_python' from source: facts 28983 1726883047.45764: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883047.46036: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883047.46260: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883047.46666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.46700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.46733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.46940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.46943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.47073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883047.47119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883047.47345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.47348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883047.47383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883047.47923: variable 'network_connections' from source: include params 28983 1726883047.47996: variable 'interface' from source: play vars 28983 1726883047.48055: variable 'interface' from source: play vars 28983 1726883047.48840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883047.49373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883047.49547: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883047.49884: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883047.49946: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883047.50217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883047.50317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883047.50366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883047.50547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883047.50669: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883047.51554: variable 'network_connections' from source: include params 28983 1726883047.51568: variable 'interface' from source: play vars 28983 1726883047.51740: variable 'interface' from source: play vars 28983 1726883047.51823: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883047.52040: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883047.53344: variable 'network_connections' from source: include params 28983 1726883047.53347: variable 'interface' from source: play vars 28983 1726883047.53670: variable 'interface' from source: play vars 28983 1726883047.53674: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883047.54001: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883047.55117: variable 'network_connections' from source: include params 28983 1726883047.55206: variable 'interface' from source: play vars 28983 1726883047.55295: variable 'interface' from source: play vars 28983 1726883047.55743: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883047.55924: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883047.55941: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883047.56217: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883047.57045: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883047.59487: variable 'network_connections' from source: include params 28983 1726883047.59499: variable 'interface' from source: play vars 28983 1726883047.59695: variable 'interface' from source: play vars 28983 1726883047.59799: variable 'ansible_distribution' from source: facts 28983 1726883047.59809: variable '__network_rh_distros' from source: role '' defaults 28983 1726883047.59820: variable 'ansible_distribution_major_version' from source: facts 28983 1726883047.59914: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883047.60899: variable 'ansible_distribution' from source: facts 28983 1726883047.60905: variable '__network_rh_distros' from source: role '' defaults 28983 1726883047.60908: variable 'ansible_distribution_major_version' from source: facts 28983 1726883047.61059: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883047.61831: variable 'ansible_distribution' from source: facts 28983 1726883047.61939: variable '__network_rh_distros' from source: role '' defaults 28983 1726883047.61991: variable 'ansible_distribution_major_version' from source: facts 28983 1726883047.62186: variable 'network_provider' from source: set_fact 28983 1726883047.62374: variable 'omit' from source: magic vars 28983 1726883047.62755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883047.62760: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883047.62785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883047.62964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883047.63000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883047.63129: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883047.63177: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883047.63352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883047.63641: Set connection var ansible_connection to ssh 28983 1726883047.63645: Set connection var ansible_shell_executable to /bin/sh 28983 1726883047.63648: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883047.63714: Set connection var ansible_timeout to 10 28983 1726883047.63727: Set connection var ansible_pipelining to False 28983 1726883047.63742: Set connection var ansible_shell_type to sh 28983 1726883047.63867: variable 'ansible_shell_executable' from source: unknown 28983 1726883047.63881: variable 'ansible_connection' from source: unknown 28983 1726883047.63990: variable 'ansible_module_compression' from source: unknown 28983 1726883047.63994: variable 'ansible_shell_type' from source: unknown 28983 1726883047.63996: variable 'ansible_shell_executable' from source: unknown 28983 1726883047.63998: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883047.64008: variable 'ansible_pipelining' from source: unknown 28983 1726883047.64016: variable 'ansible_timeout' from source: unknown 28983 1726883047.64043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883047.64414: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883047.64632: variable 'omit' from source: magic vars 28983 1726883047.64637: starting attempt loop 28983 1726883047.64640: running the handler 28983 1726883047.64741: variable 'ansible_facts' from source: unknown 28983 1726883047.67642: _low_level_execute_command(): starting 28983 1726883047.67645: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883047.69099: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883047.69191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883047.69348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883047.69474: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 28983 1726883047.71288: stdout chunk (state=3): >>>/root <<< 28983 1726883047.71478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883047.71640: stderr chunk (state=3): >>><<< 28983 1726883047.71643: stdout chunk (state=3): >>><<< 28983 1726883047.71814: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883047.71817: _low_level_execute_command(): starting 28983 1726883047.71821: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645 `" && echo ansible-tmp-1726883047.7166576-31780-34452905848645="` echo /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645 `" ) && sleep 0' 28983 1726883047.72630: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883047.72642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883047.72646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883047.72648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883047.72651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883047.72786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883047.72789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883047.72955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883047.74958: stdout chunk (state=3): >>>ansible-tmp-1726883047.7166576-31780-34452905848645=/root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645 <<< 28983 1726883047.75180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883047.75184: stdout chunk (state=3): >>><<< 28983 1726883047.75193: stderr chunk (state=3): >>><<< 28983 1726883047.75228: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726883047.7166576-31780-34452905848645=/root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883047.75271: variable 'ansible_module_compression' from source: unknown 28983 1726883047.75331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883047.75556: variable 'ansible_facts' from source: unknown 28983 1726883047.75967: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py 28983 1726883047.76361: Sending initial data 28983 1726883047.76365: Sent initial data (155 bytes) 28983 1726883047.77536: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883047.77594: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883047.77719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883047.77772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883047.77857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883047.79567: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883047.79659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883047.79748: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmprbf5jgtn /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py <<< 28983 1726883047.79752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py" <<< 28983 1726883047.79828: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmprbf5jgtn" to remote "/root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py" <<< 28983 1726883047.85477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883047.85527: stderr chunk (state=3): >>><<< 28983 1726883047.85542: stdout chunk (state=3): >>><<< 28983 1726883047.85684: done transferring module to remote 28983 1726883047.85688: _low_level_execute_command(): starting 28983 1726883047.85691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/ /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py && sleep 0' 28983 1726883047.86355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883047.86459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883047.86479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883047.86498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883047.86679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883047.88943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883047.88947: stdout chunk (state=3): >>><<< 28983 1726883047.88949: stderr chunk (state=3): >>><<< 28983 1726883047.88952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883047.88955: _low_level_execute_command(): starting 28983 1726883047.89129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/AnsiballZ_systemd.py && sleep 0' 28983 1726883047.90359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883047.90596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883047.90616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883047.90630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883047.90741: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 28983 1726883048.23390: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl 
call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4526080", "MemoryAvailable": "infinity", "CPUUsageNSec": "1585237000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726883048.23420: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": 
"read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service 
multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 28983 1726883048.23440: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": 
"none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883048.25375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883048.25437: stderr chunk (state=3): >>><<< 28983 1726883048.25440: stdout chunk (state=3): >>><<< 28983 1726883048.25475: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4526080", "MemoryAvailable": "infinity", "CPUUsageNSec": "1585237000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", 
"StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", 
"CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883048.25665: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883048.25683: _low_level_execute_command(): starting 28983 1726883048.25688: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883047.7166576-31780-34452905848645/ > /dev/null 2>&1 && sleep 0' 28983 1726883048.26299: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883048.26302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883048.26307: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883048.26321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883048.26339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883048.26387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883048.26491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883048.28576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883048.28588: stdout chunk (state=3): >>><<< 28983 1726883048.28603: stderr chunk (state=3): >>><<< 28983 1726883048.28641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883048.28655: handler run complete 28983 1726883048.28749: attempt loop complete, returning result 28983 1726883048.28759: _execute() done 28983 1726883048.28810: dumping result to json 28983 1726883048.28813: done dumping result, returning 28983 1726883048.28815: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000001286] 28983 1726883048.28824: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001286 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883048.30248: no more pending results, returning what we have 28983 1726883048.30252: results queue empty 28983 1726883048.30253: checking for any_errors_fatal 28983 1726883048.30257: done checking for any_errors_fatal 28983 1726883048.30258: checking for max_fail_percentage 28983 1726883048.30261: done checking for max_fail_percentage 28983 1726883048.30262: checking to see if all hosts have failed and the running result is not ok 28983 1726883048.30267: done checking to see if all hosts have failed 28983 1726883048.30268: getting the remaining hosts for this loop 28983 1726883048.30270: done getting the remaining hosts for this loop 28983 1726883048.30276: getting the next task for host managed_node2 28983 1726883048.30285: done getting next task for host managed_node2 28983 1726883048.30291: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883048.30298: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883048.30306: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001286 28983 1726883048.30310: WORKER PROCESS EXITING 28983 1726883048.30483: getting variables 28983 1726883048.30485: in VariableManager get_vars() 28983 1726883048.30699: Calling all_inventory to load vars for managed_node2 28983 1726883048.30703: Calling groups_inventory to load vars for managed_node2 28983 1726883048.30706: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883048.30716: Calling all_plugins_play to load vars for managed_node2 28983 1726883048.30733: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883048.30745: Calling groups_plugins_play to load vars for managed_node2 28983 1726883048.35698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883048.39755: done with get_vars() 28983 1726883048.39797: done getting variables 28983 1726883048.39878: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:08 -0400 (0:00:01.094) 0:01:18.397 ****** 28983 1726883048.39925: entering _queue_task() for managed_node2/service 28983 1726883048.40656: worker is 1 (out of 1 available) 28983 1726883048.40666: exiting _queue_task() for managed_node2/service 28983 1726883048.40677: done queuing things up, now waiting for results queue to drain 28983 1726883048.40679: waiting for pending results... 
28983 1726883048.41056: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883048.41061: in run() - task 0affe814-3a2d-b16d-c0a7-000000001287 28983 1726883048.41065: variable 'ansible_search_path' from source: unknown 28983 1726883048.41068: variable 'ansible_search_path' from source: unknown 28983 1726883048.41147: calling self._execute() 28983 1726883048.41209: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883048.41216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883048.41240: variable 'omit' from source: magic vars 28983 1726883048.41839: variable 'ansible_distribution_major_version' from source: facts 28983 1726883048.41844: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883048.41862: variable 'network_provider' from source: set_fact 28983 1726883048.41878: Evaluated conditional (network_provider == "nm"): True 28983 1726883048.42001: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883048.42203: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883048.42390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883048.45918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883048.46001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883048.46148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883048.46153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883048.46156: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883048.46232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883048.46271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883048.46305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883048.46365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883048.46380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883048.46442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883048.46476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883048.46506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883048.46559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883048.46584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883048.46632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883048.46664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883048.46693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883048.46748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883048.46765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883048.46953: variable 'network_connections' from source: include params 28983 1726883048.46966: variable 'interface' from source: play vars 28983 1726883048.47181: variable 'interface' from source: play vars 28983 1726883048.47464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883048.47920: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883048.47971: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883048.48012: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883048.48339: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883048.48343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883048.48346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883048.48377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883048.48407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883048.48597: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883048.49130: variable 'network_connections' from source: include params 28983 1726883048.49136: variable 'interface' from source: play vars 28983 1726883048.49211: variable 'interface' from source: play vars 28983 1726883048.49256: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883048.49260: when evaluation is False, skipping this task 28983 1726883048.49263: _execute() done 28983 1726883048.49268: dumping result to json 28983 1726883048.49276: done dumping result, returning 28983 1726883048.49283: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000001287] 28983 
1726883048.49292: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001287 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883048.49566: no more pending results, returning what we have 28983 1726883048.49569: results queue empty 28983 1726883048.49570: checking for any_errors_fatal 28983 1726883048.49599: done checking for any_errors_fatal 28983 1726883048.49600: checking for max_fail_percentage 28983 1726883048.49602: done checking for max_fail_percentage 28983 1726883048.49603: checking to see if all hosts have failed and the running result is not ok 28983 1726883048.49604: done checking to see if all hosts have failed 28983 1726883048.49605: getting the remaining hosts for this loop 28983 1726883048.49606: done getting the remaining hosts for this loop 28983 1726883048.49610: getting the next task for host managed_node2 28983 1726883048.49618: done getting next task for host managed_node2 28983 1726883048.49623: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883048.49628: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883048.49652: getting variables 28983 1726883048.49653: in VariableManager get_vars() 28983 1726883048.49737: Calling all_inventory to load vars for managed_node2 28983 1726883048.49740: Calling groups_inventory to load vars for managed_node2 28983 1726883048.49743: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883048.49753: Calling all_plugins_play to load vars for managed_node2 28983 1726883048.49757: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883048.49760: Calling groups_plugins_play to load vars for managed_node2 28983 1726883048.50293: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001287 28983 1726883048.50296: WORKER PROCESS EXITING 28983 1726883048.52584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883048.55696: done with get_vars() 28983 1726883048.55742: done getting variables 28983 1726883048.55828: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:08 -0400 (0:00:00.159) 0:01:18.556 ****** 28983 1726883048.55878: entering _queue_task() for managed_node2/service 28983 1726883048.56361: worker is 1 (out of 1 available) 28983 
1726883048.56375: exiting _queue_task() for managed_node2/service 28983 1726883048.56391: done queuing things up, now waiting for results queue to drain 28983 1726883048.56396: waiting for pending results... 28983 1726883048.56656: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883048.56766: in run() - task 0affe814-3a2d-b16d-c0a7-000000001288 28983 1726883048.56781: variable 'ansible_search_path' from source: unknown 28983 1726883048.56785: variable 'ansible_search_path' from source: unknown 28983 1726883048.56817: calling self._execute() 28983 1726883048.56909: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883048.56915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883048.56926: variable 'omit' from source: magic vars 28983 1726883048.57264: variable 'ansible_distribution_major_version' from source: facts 28983 1726883048.57277: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883048.57381: variable 'network_provider' from source: set_fact 28983 1726883048.57387: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883048.57392: when evaluation is False, skipping this task 28983 1726883048.57395: _execute() done 28983 1726883048.57398: dumping result to json 28983 1726883048.57403: done dumping result, returning 28983 1726883048.57416: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000001288] 28983 1726883048.57419: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001288 28983 1726883048.57510: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001288 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883048.57574: no more pending results, 
returning what we have 28983 1726883048.57578: results queue empty 28983 1726883048.57579: checking for any_errors_fatal 28983 1726883048.57590: done checking for any_errors_fatal 28983 1726883048.57590: checking for max_fail_percentage 28983 1726883048.57593: done checking for max_fail_percentage 28983 1726883048.57594: checking to see if all hosts have failed and the running result is not ok 28983 1726883048.57595: done checking to see if all hosts have failed 28983 1726883048.57596: getting the remaining hosts for this loop 28983 1726883048.57598: done getting the remaining hosts for this loop 28983 1726883048.57602: getting the next task for host managed_node2 28983 1726883048.57610: done getting next task for host managed_node2 28983 1726883048.57615: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883048.57620: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 28983 1726883048.57647: getting variables 28983 1726883048.57649: in VariableManager get_vars() 28983 1726883048.57684: Calling all_inventory to load vars for managed_node2 28983 1726883048.57687: Calling groups_inventory to load vars for managed_node2 28983 1726883048.57689: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883048.57698: Calling all_plugins_play to load vars for managed_node2 28983 1726883048.57701: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883048.57705: Calling groups_plugins_play to load vars for managed_node2 28983 1726883048.58240: WORKER PROCESS EXITING 28983 1726883048.58968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883048.62302: done with get_vars() 28983 1726883048.62325: done getting variables 28983 1726883048.62385: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:08 -0400 (0:00:00.065) 0:01:18.621 ****** 28983 1726883048.62415: entering _queue_task() for managed_node2/copy 28983 1726883048.62651: worker is 1 (out of 1 available) 28983 1726883048.62663: exiting _queue_task() for managed_node2/copy 28983 1726883048.62677: done queuing things up, now waiting for results queue to drain 28983 1726883048.62679: waiting for pending results... 
28983 1726883048.62873: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883048.62973: in run() - task 0affe814-3a2d-b16d-c0a7-000000001289 28983 1726883048.62988: variable 'ansible_search_path' from source: unknown 28983 1726883048.62991: variable 'ansible_search_path' from source: unknown 28983 1726883048.63030: calling self._execute() 28983 1726883048.63116: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883048.63121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883048.63138: variable 'omit' from source: magic vars 28983 1726883048.63489: variable 'ansible_distribution_major_version' from source: facts 28983 1726883048.63500: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883048.63624: variable 'network_provider' from source: set_fact 28983 1726883048.63630: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883048.63635: when evaluation is False, skipping this task 28983 1726883048.63638: _execute() done 28983 1726883048.63645: dumping result to json 28983 1726883048.63657: done dumping result, returning 28983 1726883048.63662: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000001289] 28983 1726883048.63680: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001289 28983 1726883048.63786: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001289 28983 1726883048.63790: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883048.63849: no more pending results, returning what we have 28983 1726883048.63853: results queue empty 28983 1726883048.63854: checking for 
any_errors_fatal 28983 1726883048.63860: done checking for any_errors_fatal 28983 1726883048.63861: checking for max_fail_percentage 28983 1726883048.63863: done checking for max_fail_percentage 28983 1726883048.63867: checking to see if all hosts have failed and the running result is not ok 28983 1726883048.63868: done checking to see if all hosts have failed 28983 1726883048.63869: getting the remaining hosts for this loop 28983 1726883048.63871: done getting the remaining hosts for this loop 28983 1726883048.63877: getting the next task for host managed_node2 28983 1726883048.63888: done getting next task for host managed_node2 28983 1726883048.63895: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883048.63901: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883048.63926: getting variables 28983 1726883048.63928: in VariableManager get_vars() 28983 1726883048.63969: Calling all_inventory to load vars for managed_node2 28983 1726883048.63972: Calling groups_inventory to load vars for managed_node2 28983 1726883048.63974: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883048.63983: Calling all_plugins_play to load vars for managed_node2 28983 1726883048.63987: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883048.63992: Calling groups_plugins_play to load vars for managed_node2 28983 1726883048.66732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883048.72529: done with get_vars() 28983 1726883048.72808: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:08 -0400 (0:00:00.106) 0:01:18.728 ****** 28983 1726883048.73078: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883048.73541: worker is 1 (out of 1 available) 28983 1726883048.73557: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883048.73688: done queuing things up, now waiting for results queue to drain 28983 1726883048.73691: waiting for pending results... 
28983 1726883048.73890: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883048.74154: in run() - task 0affe814-3a2d-b16d-c0a7-00000000128a 28983 1726883048.74159: variable 'ansible_search_path' from source: unknown 28983 1726883048.74162: variable 'ansible_search_path' from source: unknown 28983 1726883048.74218: calling self._execute() 28983 1726883048.74450: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883048.74458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883048.74461: variable 'omit' from source: magic vars 28983 1726883048.74817: variable 'ansible_distribution_major_version' from source: facts 28983 1726883048.74829: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883048.74838: variable 'omit' from source: magic vars 28983 1726883048.74899: variable 'omit' from source: magic vars 28983 1726883048.75042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883048.77672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883048.77783: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883048.77791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883048.77794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883048.77836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883048.77907: variable 'network_provider' from source: set_fact 28983 1726883048.78115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883048.78160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883048.78164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883048.78268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883048.78275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883048.78369: variable 'omit' from source: magic vars 28983 1726883048.78619: variable 'omit' from source: magic vars 28983 1726883048.78624: variable 'network_connections' from source: include params 28983 1726883048.78729: variable 'interface' from source: play vars 28983 1726883048.78734: variable 'interface' from source: play vars 28983 1726883048.78980: variable 'omit' from source: magic vars 28983 1726883048.78987: variable '__lsr_ansible_managed' from source: task vars 28983 1726883048.79049: variable '__lsr_ansible_managed' from source: task vars 28983 1726883048.79266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883048.86428: Loaded config def from plugin (lookup/template) 28983 1726883048.86433: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883048.86456: File lookup term: get_ansible_managed.j2 28983 1726883048.86460: variable 
'ansible_search_path' from source: unknown 28983 1726883048.86464: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883048.86480: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883048.86503: variable 'ansible_search_path' from source: unknown 28983 1726883048.96532: variable 'ansible_managed' from source: unknown 28983 1726883048.96596: variable 'omit' from source: magic vars 28983 1726883048.96625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883048.96841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883048.96847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883048.96849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883048.96852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883048.96854: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883048.96863: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883048.96867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883048.96869: Set connection var ansible_connection to ssh 28983 1726883048.96872: Set connection var ansible_shell_executable to /bin/sh 28983 1726883048.96876: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883048.96886: Set connection var ansible_timeout to 10 28983 1726883048.96893: Set connection var ansible_pipelining to False 28983 1726883048.96896: Set connection var ansible_shell_type to sh 28983 1726883048.96977: variable 'ansible_shell_executable' from source: unknown 28983 1726883048.96980: variable 'ansible_connection' from source: unknown 28983 1726883048.96983: variable 'ansible_module_compression' from source: unknown 28983 1726883048.96987: variable 'ansible_shell_type' from source: unknown 28983 1726883048.97085: variable 'ansible_shell_executable' from source: unknown 28983 1726883048.97088: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883048.97091: variable 'ansible_pipelining' from source: unknown 28983 1726883048.97093: variable 'ansible_timeout' from source: unknown 28983 1726883048.97095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883048.97160: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883048.97170: variable 'omit' from 
source: magic vars 28983 1726883048.97175: starting attempt loop 28983 1726883048.97178: running the handler 28983 1726883048.97180: _low_level_execute_command(): starting 28983 1726883048.97183: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883048.97808: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883048.97969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883048.97977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883048.98053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883048.99903: stdout chunk (state=3): >>>/root <<< 28983 1726883049.00386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883049.00397: stdout chunk (state=3): >>><<< 28983 1726883049.00400: stderr chunk (state=3): >>><<< 28983 1726883049.00403: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883049.00406: _low_level_execute_command(): starting 28983 1726883049.00606: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829 `" && echo ansible-tmp-1726883049.0035508-31842-44457840696829="` echo /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829 `" ) && sleep 0' 28983 1726883049.01776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883049.01938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883049.01960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883049.01986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883049.02012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883049.02153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883049.04299: stdout chunk (state=3): >>>ansible-tmp-1726883049.0035508-31842-44457840696829=/root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829 <<< 28983 1726883049.04512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883049.04571: stderr chunk (state=3): >>><<< 28983 1726883049.04610: stdout chunk (state=3): >>><<< 28983 1726883049.04949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883049.0035508-31842-44457840696829=/root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883049.04953: variable 'ansible_module_compression' from source: unknown 28983 1726883049.04956: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883049.04958: variable 'ansible_facts' from source: unknown 28983 1726883049.05241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py 28983 1726883049.05664: Sending initial data 28983 1726883049.05675: Sent initial data (167 bytes) 28983 1726883049.07665: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883049.07855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883049.07944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883049.07968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883049.08163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883049.10043: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883049.10151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883049.10224: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpp4bykx9w /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py <<< 28983 1726883049.10237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py" <<< 28983 1726883049.10381: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpp4bykx9w" to remote "/root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py" <<< 28983 1726883049.13183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883049.13312: stderr chunk (state=3): >>><<< 28983 1726883049.13329: stdout chunk (state=3): >>><<< 28983 1726883049.13547: done transferring module to remote 28983 1726883049.13551: _low_level_execute_command(): starting 28983 1726883049.13554: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/ /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py && sleep 0' 28983 1726883049.15255: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883049.15535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883049.15645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883049.17753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883049.17757: stdout chunk (state=3): >>><<< 28983 1726883049.17759: stderr chunk (state=3): >>><<< 28983 1726883049.17762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883049.17764: _low_level_execute_command(): starting 28983 1726883049.17767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/AnsiballZ_network_connections.py && sleep 0' 28983 1726883049.18941: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883049.19040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883049.19044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883049.19047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883049.19050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883049.19128: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883049.19666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883049.19754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 
1726883049.48533: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883049.50430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883049.50645: stderr chunk (state=3): >>><<< 28983 1726883049.50650: stdout chunk (state=3): >>><<< 28983 1726883049.50653: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883049.50656: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883049.50661: _low_level_execute_command(): starting 28983 1726883049.50663: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883049.0035508-31842-44457840696829/ > /dev/null 2>&1 && sleep 0' 28983 1726883049.51940: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883049.51944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883049.51946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883049.51949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883049.51951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883049.51954: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883049.51982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883049.52000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883049.52029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883049.52122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883049.54082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883049.54111: stderr chunk (state=3): >>><<< 28983 1726883049.54113: stdout chunk (state=3): >>><<< 28983 
1726883049.54124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883049.54139: handler run complete 28983 1726883049.54164: attempt loop complete, returning result 28983 1726883049.54167: _execute() done 28983 1726883049.54174: dumping result to json 28983 1726883049.54182: done dumping result, returning 28983 1726883049.54191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-00000000128a] 28983 1726883049.54196: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128a 28983 1726883049.54318: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128a 28983 1726883049.54324: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 skipped because already active 28983 1726883049.54511: no more pending results, returning what we have 28983 1726883049.54515: results queue empty 28983 1726883049.54516: checking for any_errors_fatal 28983 1726883049.54522: done checking for any_errors_fatal 28983 1726883049.54523: checking for max_fail_percentage 28983 1726883049.54525: done checking for max_fail_percentage 28983 1726883049.54526: checking to see if all hosts have failed and the running result is not ok 28983 1726883049.54532: done checking to see if all hosts have failed 28983 1726883049.54533: getting the remaining hosts for this loop 28983 1726883049.54627: done getting the remaining hosts for this loop 28983 1726883049.54635: getting the next task for host managed_node2 28983 1726883049.54643: done getting next task for host managed_node2 28983 1726883049.54731: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883049.54738: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883049.54755: getting variables 28983 1726883049.54757: in VariableManager get_vars() 28983 1726883049.54824: Calling all_inventory to load vars for managed_node2 28983 1726883049.54828: Calling groups_inventory to load vars for managed_node2 28983 1726883049.54848: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883049.54859: Calling all_plugins_play to load vars for managed_node2 28983 1726883049.54863: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883049.54868: Calling groups_plugins_play to load vars for managed_node2 28983 1726883049.64116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883049.66194: done with get_vars() 28983 1726883049.66231: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:09 -0400 (0:00:00.932) 0:01:19.660 ****** 28983 1726883049.66296: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883049.66592: worker is 1 (out of 1 available) 28983 1726883049.66612: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883049.66624: done queuing things up, now waiting for results queue to drain 28983 1726883049.66627: waiting for pending results... 
28983 1726883049.66854: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883049.67022: in run() - task 0affe814-3a2d-b16d-c0a7-00000000128b 28983 1726883049.67030: variable 'ansible_search_path' from source: unknown 28983 1726883049.67035: variable 'ansible_search_path' from source: unknown 28983 1726883049.67070: calling self._execute() 28983 1726883049.67161: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883049.67168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883049.67181: variable 'omit' from source: magic vars 28983 1726883049.67616: variable 'ansible_distribution_major_version' from source: facts 28983 1726883049.67627: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883049.67786: variable 'network_state' from source: role '' defaults 28983 1726883049.67795: Evaluated conditional (network_state != {}): False 28983 1726883049.67807: when evaluation is False, skipping this task 28983 1726883049.67813: _execute() done 28983 1726883049.67822: dumping result to json 28983 1726883049.67825: done dumping result, returning 28983 1726883049.67828: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-00000000128b] 28983 1726883049.67830: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128b 28983 1726883049.67958: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128b 28983 1726883049.67962: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883049.68033: no more pending results, returning what we have 28983 1726883049.68039: results queue empty 28983 1726883049.68040: checking for any_errors_fatal 28983 1726883049.68050: done checking for any_errors_fatal 
28983 1726883049.68051: checking for max_fail_percentage 28983 1726883049.68053: done checking for max_fail_percentage 28983 1726883049.68054: checking to see if all hosts have failed and the running result is not ok 28983 1726883049.68055: done checking to see if all hosts have failed 28983 1726883049.68056: getting the remaining hosts for this loop 28983 1726883049.68058: done getting the remaining hosts for this loop 28983 1726883049.68062: getting the next task for host managed_node2 28983 1726883049.68070: done getting next task for host managed_node2 28983 1726883049.68076: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883049.68083: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883049.68114: getting variables 28983 1726883049.68116: in VariableManager get_vars() 28983 1726883049.68182: Calling all_inventory to load vars for managed_node2 28983 1726883049.68185: Calling groups_inventory to load vars for managed_node2 28983 1726883049.68188: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883049.68198: Calling all_plugins_play to load vars for managed_node2 28983 1726883049.68201: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883049.68204: Calling groups_plugins_play to load vars for managed_node2 28983 1726883049.70366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883049.73377: done with get_vars() 28983 1726883049.73412: done getting variables 28983 1726883049.73500: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:09 -0400 (0:00:00.072) 0:01:19.733 ****** 28983 1726883049.73548: entering _queue_task() for managed_node2/debug 28983 1726883049.74677: worker is 1 (out of 1 available) 28983 1726883049.74696: exiting _queue_task() for managed_node2/debug 28983 1726883049.74707: done queuing things up, now waiting for results queue to drain 28983 1726883049.74709: waiting for pending results... 
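The skip recorded just above follows from the role's guard: `network_state` comes from the role defaults as `{}`, so the conditional `network_state != {}` evaluates False and the "Configure networking state" task never runs. A plain-Python stand-in for that evaluation (an illustrative helper, not the actual ansible-core conditional code):

```python
def should_run(network_state):
    """Mimic the role's `when: network_state != {}` guard.

    Returns (run, skip_reason). Illustrative stand-in only; the real
    evaluation happens inside Ansible's TaskExecutor via Jinja2.
    """
    if network_state != {}:
        return True, None
    # Matches the skip_reason string Ansible prints in the log above.
    return False, "Conditional result was False"

# Role default, as shown in the log: network_state defaults to {}.
print(should_run({}))
print(should_run({"interfaces": []}))
```

With the default empty dict the task is skipped, which is exactly the `skipping: [managed_node2]` result shown above; supplying a non-empty `network_state` would let the task proceed.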
28983 1726883049.76185: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
28983 1726883049.77439: in run() - task 0affe814-3a2d-b16d-c0a7-00000000128c
28983 1726883049.78066: variable 'ansible_search_path' from source: unknown
28983 1726883049.78070: variable 'ansible_search_path' from source: unknown
28983 1726883049.78075: calling self._execute()
28983 1726883049.78323: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883049.78337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883049.78371: variable 'omit' from source: magic vars
28983 1726883049.79237: variable 'ansible_distribution_major_version' from source: facts
28983 1726883049.79362: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883049.79366: variable 'omit' from source: magic vars
28983 1726883049.79670: variable 'omit' from source: magic vars
28983 1726883049.79852: variable 'omit' from source: magic vars
28983 1726883049.79856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28983 1726883049.79957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28983 1726883049.79965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28983 1726883049.79970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883049.79975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883049.80025: variable 'inventory_hostname' from source: host vars for 'managed_node2'
28983 1726883049.80036: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883049.80040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883049.80147: Set connection var ansible_connection to ssh
28983 1726883049.80175: Set connection var ansible_shell_executable to /bin/sh
28983 1726883049.80199: Set connection var ansible_module_compression to ZIP_DEFLATED
28983 1726883049.80203: Set connection var ansible_timeout to 10
28983 1726883049.80205: Set connection var ansible_pipelining to False
28983 1726883049.80208: Set connection var ansible_shell_type to sh
28983 1726883049.80238: variable 'ansible_shell_executable' from source: unknown
28983 1726883049.80244: variable 'ansible_connection' from source: unknown
28983 1726883049.80247: variable 'ansible_module_compression' from source: unknown
28983 1726883049.80250: variable 'ansible_shell_type' from source: unknown
28983 1726883049.80253: variable 'ansible_shell_executable' from source: unknown
28983 1726883049.80256: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883049.80282: variable 'ansible_pipelining' from source: unknown
28983 1726883049.80285: variable 'ansible_timeout' from source: unknown
28983 1726883049.80288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883049.80484: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28983 1726883049.80550: variable 'omit' from source: magic vars
28983 1726883049.80554: starting attempt loop
28983 1726883049.80557: running the handler
28983 1726883049.80689: variable '__network_connections_result' from source: set_fact
28983 1726883049.80745: handler run complete
28983 1726883049.80765: attempt loop complete, returning result
28983 1726883049.80768: _execute() done
28983 1726883049.80772: dumping result to json
28983 1726883049.80777: done dumping result, returning
28983 1726883049.80842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000128c]
28983 1726883049.80847: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128c
28983 1726883049.80923: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128c
28983 1726883049.80926: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 skipped because already active"
    ]
}
28983 1726883049.81036: no more pending results, returning what we have
28983 1726883049.81041: results queue empty
28983 1726883049.81043: checking for any_errors_fatal
28983 1726883049.81052: done checking for any_errors_fatal
28983 1726883049.81053: checking for max_fail_percentage
28983 1726883049.81056: done checking for max_fail_percentage
28983 1726883049.81059: checking to see if all hosts have failed and the running result is not ok
28983 1726883049.81060: done checking to see if all hosts have failed
28983 1726883049.81061: getting the remaining hosts for this loop
28983 1726883049.81064: done getting the remaining hosts for this loop
28983 1726883049.81070: getting the next task for host managed_node2
28983 1726883049.81083: done getting next task for host managed_node2
28983 1726883049.81089: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
28983 1726883049.81098: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883049.81120: getting variables
28983 1726883049.81122: in VariableManager get_vars()
28983 1726883049.81374: Calling all_inventory to load vars for managed_node2
28983 1726883049.81378: Calling groups_inventory to load vars for managed_node2
28983 1726883049.81381: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883049.81392: Calling all_plugins_play to load vars for managed_node2
28983 1726883049.81396: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883049.81401: Calling groups_plugins_play to load vars for managed_node2
28983 1726883049.85487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883049.89708: done with get_vars()
28983 1726883049.89759: done getting variables
28983 1726883049.89840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:44:09 -0400 (0:00:00.163) 0:01:19.896 ******
28983 1726883049.89892: entering _queue_task() for managed_node2/debug
28983 1726883049.90272: worker is 1 (out of 1 available)
28983 1726883049.90286: exiting _queue_task() for managed_node2/debug
28983 1726883049.90299: done queuing things up, now waiting for results queue to drain
28983 1726883049.90301: waiting for pending results...
28983 1726883049.90885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
28983 1726883049.90892: in run() - task 0affe814-3a2d-b16d-c0a7-00000000128d
28983 1726883049.90896: variable 'ansible_search_path' from source: unknown
28983 1726883049.90899: variable 'ansible_search_path' from source: unknown
28983 1726883049.90901: calling self._execute()
28983 1726883049.91145: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883049.91153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883049.91164: variable 'omit' from source: magic vars
28983 1726883049.92126: variable 'ansible_distribution_major_version' from source: facts
28983 1726883049.92341: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883049.92344: variable 'omit' from source: magic vars
28983 1726883049.92412: variable 'omit' from source: magic vars
28983 1726883049.92506: variable 'omit' from source: magic vars
28983 1726883049.92558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28983 1726883049.92712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28983 1726883049.92739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28983 1726883049.92762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883049.92777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883049.93033: variable 'inventory_hostname' from source: host vars for 'managed_node2'
28983 1726883049.93038: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883049.93041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883049.93256: Set connection var ansible_connection to ssh
28983 1726883049.93260: Set connection var ansible_shell_executable to /bin/sh
28983 1726883049.93262: Set connection var ansible_module_compression to ZIP_DEFLATED
28983 1726883049.93265: Set connection var ansible_timeout to 10
28983 1726883049.93267: Set connection var ansible_pipelining to False
28983 1726883049.93269: Set connection var ansible_shell_type to sh
28983 1726883049.93355: variable 'ansible_shell_executable' from source: unknown
28983 1726883049.93360: variable 'ansible_connection' from source: unknown
28983 1726883049.93363: variable 'ansible_module_compression' from source: unknown
28983 1726883049.93371: variable 'ansible_shell_type' from source: unknown
28983 1726883049.93376: variable 'ansible_shell_executable' from source: unknown
28983 1726883049.93379: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883049.93381: variable 'ansible_pipelining' from source: unknown
28983 1726883049.93383: variable 'ansible_timeout' from source: unknown
28983 1726883049.93389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883049.93780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
28983 1726883049.93800: variable 'omit' from source: magic vars
28983 1726883049.93901: starting attempt loop
28983 1726883049.93904: running the handler
28983 1726883049.93907: variable '__network_connections_result' from source: set_fact
28983 1726883049.94211: variable '__network_connections_result' from source: set_fact
28983 1726883049.94562: handler run complete
28983 1726883049.94566: attempt loop complete, returning result
28983 1726883049.94568: _execute() done
28983 1726883049.94571: dumping result to json
28983 1726883049.94577: done dumping result, returning
28983 1726883049.94580: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000128d]
28983 1726883049.94583: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128d
28983 1726883049.95015: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128d
28983 1726883049.95107: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "state": "up"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": false,
        "failed": false,
        "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 skipped because already active\n",
        "stderr_lines": [
            "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 3ac79eb6-77ee-484f-9752-0ce3ea88e423 skipped because already active"
        ]
    }
}
28983 1726883049.95236: no more pending results, returning what we have
28983 1726883049.95241: results queue empty
28983 1726883049.95242: checking for any_errors_fatal
28983 1726883049.95251: done checking for any_errors_fatal
28983 1726883049.95252: checking for max_fail_percentage
28983 1726883049.95255: done checking for max_fail_percentage
28983 1726883049.95256: checking to see if all hosts have failed and the running result is not ok
28983 1726883049.95257: done checking to see if all hosts have failed
28983 1726883049.95260: getting the remaining hosts for this loop
28983 1726883049.95263: done getting the remaining hosts for this loop
28983 1726883049.95268: getting the next task for host managed_node2
28983 1726883049.95279: done getting next task for host managed_node2
28983 1726883049.95285: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
28983 1726883049.95291: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883049.95308: getting variables
28983 1726883049.95310: in VariableManager get_vars()
28983 1726883049.95518: Calling all_inventory to load vars for managed_node2
28983 1726883049.95522: Calling groups_inventory to load vars for managed_node2
28983 1726883049.95532: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883049.95547: Calling all_plugins_play to load vars for managed_node2
28983 1726883049.95551: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883049.95555: Calling groups_plugins_play to load vars for managed_node2
28983 1726883049.98952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883050.03479: done with get_vars()
28983 1726883050.03528: done getting variables
28983 1726883050.03614: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:44:10 -0400 (0:00:00.138) 0:01:20.034 ******
28983 1726883050.03705: entering _queue_task() for managed_node2/debug
28983 1726883050.04695: worker is 1 (out of 1 available)
28983 1726883050.04709: exiting _queue_task() for managed_node2/debug
28983 1726883050.04722: done queuing things up, now waiting for results queue to drain
28983 1726883050.04723: waiting for pending results...
28983 1726883050.05727: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
28983 1726883050.06132: in run() - task 0affe814-3a2d-b16d-c0a7-00000000128e
28983 1726883050.06149: variable 'ansible_search_path' from source: unknown
28983 1726883050.06153: variable 'ansible_search_path' from source: unknown
28983 1726883050.06201: calling self._execute()
28983 1726883050.06480: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883050.06489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883050.06500: variable 'omit' from source: magic vars
28983 1726883050.07443: variable 'ansible_distribution_major_version' from source: facts
28983 1726883050.07456: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883050.07782: variable 'network_state' from source: role '' defaults
28983 1726883050.07909: Evaluated conditional (network_state != {}): False
28983 1726883050.07912: when evaluation is False, skipping this task
28983 1726883050.07916: _execute() done
28983 1726883050.07918: dumping result to json
28983 1726883050.07924: done dumping result, returning
28983 1726883050.07935: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-00000000128e]
28983 1726883050.07943: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128e
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
28983 1726883050.08333: no more pending results, returning what we have
28983 1726883050.08339: results queue empty
28983 1726883050.08340: checking for any_errors_fatal
28983 1726883050.08350: done checking for any_errors_fatal
28983 1726883050.08351: checking for max_fail_percentage
28983 1726883050.08354: done checking for max_fail_percentage
28983 1726883050.08355: checking to see if all hosts have failed and the running result is not ok
28983 1726883050.08356: done checking to see if all hosts have failed
28983 1726883050.08357: getting the remaining hosts for this loop
28983 1726883050.08360: done getting the remaining hosts for this loop
28983 1726883050.08364: getting the next task for host managed_node2
28983 1726883050.08376: done getting next task for host managed_node2
28983 1726883050.08381: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
28983 1726883050.08388: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883050.08419: getting variables
28983 1726883050.08421: in VariableManager get_vars()
28983 1726883050.08467: Calling all_inventory to load vars for managed_node2
28983 1726883050.08471: Calling groups_inventory to load vars for managed_node2
28983 1726883050.08476: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883050.08487: Calling all_plugins_play to load vars for managed_node2
28983 1726883050.08490: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883050.08493: Calling groups_plugins_play to load vars for managed_node2
28983 1726883050.09278: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128e
28983 1726883050.09282: WORKER PROCESS EXITING
28983 1726883050.14799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883050.23617: done with get_vars()
28983 1726883050.23810: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:44:10 -0400 (0:00:00.206) 0:01:20.241 ******
28983 1726883050.24357: entering _queue_task() for managed_node2/ping
28983 1726883050.25608: worker is 1 (out of 1 available)
28983 1726883050.25621: exiting _queue_task() for managed_node2/ping
28983 1726883050.25736: done queuing things up, now waiting for results queue to drain
28983 1726883050.25739: waiting for pending results...
28983 1726883050.26088: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
28983 1726883050.26458: in run() - task 0affe814-3a2d-b16d-c0a7-00000000128f
28983 1726883050.26591: variable 'ansible_search_path' from source: unknown
28983 1726883050.26595: variable 'ansible_search_path' from source: unknown
28983 1726883050.26636: calling self._execute()
28983 1726883050.27110: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883050.27121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883050.27135: variable 'omit' from source: magic vars
28983 1726883050.28243: variable 'ansible_distribution_major_version' from source: facts
28983 1726883050.28258: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883050.28308: variable 'omit' from source: magic vars
28983 1726883050.28756: variable 'omit' from source: magic vars
28983 1726883050.29061: variable 'omit' from source: magic vars
28983 1726883050.29064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28983 1726883050.29067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28983 1726883050.29070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28983 1726883050.29175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883050.29180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883050.29251: variable 'inventory_hostname' from source: host vars for 'managed_node2'
28983 1726883050.29255: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883050.29257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883050.29569: Set connection var ansible_connection to ssh
28983 1726883050.29587: Set connection var ansible_shell_executable to /bin/sh
28983 1726883050.29657: Set connection var ansible_module_compression to ZIP_DEFLATED
28983 1726883050.29665: Set connection var ansible_timeout to 10
28983 1726883050.29668: Set connection var ansible_pipelining to False
28983 1726883050.29670: Set connection var ansible_shell_type to sh
28983 1726883050.29673: variable 'ansible_shell_executable' from source: unknown
28983 1726883050.29675: variable 'ansible_connection' from source: unknown
28983 1726883050.29678: variable 'ansible_module_compression' from source: unknown
28983 1726883050.29680: variable 'ansible_shell_type' from source: unknown
28983 1726883050.29683: variable 'ansible_shell_executable' from source: unknown
28983 1726883050.29686: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883050.29716: variable 'ansible_pipelining' from source: unknown
28983 1726883050.29792: variable 'ansible_timeout' from source: unknown
28983 1726883050.29796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883050.30407: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
28983 1726883050.30412: variable 'omit' from source: magic vars
28983 1726883050.30415: starting attempt loop
28983 1726883050.30417: running the handler
28983 1726883050.30419: _low_level_execute_command(): starting
28983 1726883050.30422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28983 1726883050.32357: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883050.32527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
28983 1726883050.32583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883050.32587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883050.32687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883050.34636: stdout chunk (state=3): >>>/root <<<
28983 1726883050.34667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883050.34674: stdout chunk (state=3): >>><<<
28983 1726883050.34686: stderr chunk (state=3): >>><<<
28983 1726883050.34710: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883050.34726: _low_level_execute_command(): starting
28983 1726883050.34735: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667 `" && echo ansible-tmp-1726883050.3471181-31912-91901670251667="` echo /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667 `" ) && sleep 0'
28983 1726883050.36325: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883050.36329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883050.36572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883050.36576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883050.36596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883050.36810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
28983 1726883050.36814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883050.37172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883050.39165: stdout chunk (state=3): >>>ansible-tmp-1726883050.3471181-31912-91901670251667=/root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667 <<<
28983 1726883050.39371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883050.39378: stderr chunk (state=3): >>><<<
28983 1726883050.39380: stdout chunk (state=3): >>><<<
28983 1726883050.39384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883050.3471181-31912-91901670251667=/root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883050.39425: variable 'ansible_module_compression' from source: unknown
28983 1726883050.39474: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
28983 1726883050.39504: variable 'ansible_facts' from source: unknown
28983 1726883050.39851: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py
28983 1726883050.40014: Sending initial data
28983 1726883050.40026: Sent initial data (152 bytes)
28983 1726883050.41396: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883050.41451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883050.41898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883050.43326: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
28983 1726883050.43393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
28983 1726883050.43498: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmptcmmr4h0 /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py <<<
28983 1726883050.43501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py" <<<
28983 1726883050.43844: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmptcmmr4h0" to remote "/root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py" <<<
28983 1726883050.43867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py" <<<
28983 1726883050.45547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883050.45553: stdout chunk (state=3): >>><<<
28983 1726883050.45562: stderr chunk (state=3): >>><<<
28983 1726883050.45585: done transferring module to remote
28983 1726883050.45599: _low_level_execute_command(): starting
28983 1726883050.45605: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/ /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py && sleep 0'
28983 1726883050.47084: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
28983 1726883050.47094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883050.47105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883050.47128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883050.47137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<<
28983 1726883050.47152: stderr chunk (state=3): >>>debug2: match not found <<<
28983 1726883050.47167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883050.47182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
28983 1726883050.47192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<<
28983 1726883050.47200: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
28983 1726883050.47209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883050.47219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883050.47236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883050.47248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<<
28983 1726883050.47255: stderr chunk (state=3): >>>debug2: match found <<<
28983 1726883050.47341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883050.47353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
28983 1726883050.47359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883050.47375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883050.47479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883050.50380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883050.50384: stdout chunk (state=3): >>><<<
28983 1726883050.50387: stderr chunk (state=3): >>><<<
28983 1726883050.50389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883050.50392: _low_level_execute_command(): starting
28983 1726883050.50394: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/AnsiballZ_ping.py && sleep 0'
28983 1726883050.51347: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
28983 1726883050.51357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883050.51369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883050.51386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883050.51411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<<
28983 1726883050.51419: stderr chunk (state=3): >>>debug2: match not
found <<< 28983 1726883050.51451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883050.51516: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883050.51566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883050.51666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883050.68691: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883050.70158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883050.70217: stderr chunk (state=3): >>><<< 28983 1726883050.70220: stdout chunk (state=3): >>><<< 28983 1726883050.70252: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883050.70311: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883050.70316: _low_level_execute_command(): starting 28983 1726883050.70338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883050.3471181-31912-91901670251667/ > /dev/null 2>&1 && sleep 0' 28983 1726883050.71094: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883050.71103: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 28983 1726883050.71129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883050.71154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883050.71285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883050.73214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883050.73270: stderr chunk (state=3): >>><<< 28983 1726883050.73274: stdout chunk (state=3): >>><<< 28983 1726883050.73293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883050.73309: handler run complete 28983 1726883050.73335: attempt loop complete, returning result 28983 1726883050.73340: _execute() done 28983 1726883050.73343: dumping result to json 28983 
1726883050.73355: done dumping result, returning 28983 1726883050.73371: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-00000000128f] 28983 1726883050.73374: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128f 28983 1726883050.73515: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000128f 28983 1726883050.73519: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883050.73615: no more pending results, returning what we have 28983 1726883050.73619: results queue empty 28983 1726883050.73620: checking for any_errors_fatal 28983 1726883050.73626: done checking for any_errors_fatal 28983 1726883050.73627: checking for max_fail_percentage 28983 1726883050.73629: done checking for max_fail_percentage 28983 1726883050.73630: checking to see if all hosts have failed and the running result is not ok 28983 1726883050.73631: done checking to see if all hosts have failed 28983 1726883050.73634: getting the remaining hosts for this loop 28983 1726883050.73636: done getting the remaining hosts for this loop 28983 1726883050.73641: getting the next task for host managed_node2 28983 1726883050.73655: done getting next task for host managed_node2 28983 1726883050.73658: ^ task is: TASK: meta (role_complete) 28983 1726883050.73664: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883050.73680: getting variables 28983 1726883050.73682: in VariableManager get_vars() 28983 1726883050.73874: Calling all_inventory to load vars for managed_node2 28983 1726883050.73877: Calling groups_inventory to load vars for managed_node2 28983 1726883050.73880: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883050.73889: Calling all_plugins_play to load vars for managed_node2 28983 1726883050.73891: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883050.73894: Calling groups_plugins_play to load vars for managed_node2 28983 1726883050.75178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883050.76988: done with get_vars() 28983 1726883050.77045: done getting variables 28983 1726883050.77161: done queuing things up, now waiting for results queue to drain 28983 1726883050.77164: results queue empty 28983 1726883050.77165: checking for any_errors_fatal 28983 1726883050.77168: done checking for any_errors_fatal 28983 1726883050.77169: checking for max_fail_percentage 28983 1726883050.77171: done checking for max_fail_percentage 28983 1726883050.77172: checking to see if all hosts have failed and the running result is not ok 28983 1726883050.77176: done checking to see if all hosts have failed 28983 1726883050.77177: getting the remaining hosts for this 
loop 28983 1726883050.77179: done getting the remaining hosts for this loop 28983 1726883050.77182: getting the next task for host managed_node2 28983 1726883050.77189: done getting next task for host managed_node2 28983 1726883050.77192: ^ task is: TASK: Test 28983 1726883050.77195: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883050.77200: getting variables 28983 1726883050.77201: in VariableManager get_vars() 28983 1726883050.77220: Calling all_inventory to load vars for managed_node2 28983 1726883050.77223: Calling groups_inventory to load vars for managed_node2 28983 1726883050.77227: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883050.77233: Calling all_plugins_play to load vars for managed_node2 28983 1726883050.77237: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883050.77240: Calling groups_plugins_play to load vars for managed_node2 28983 1726883050.80109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883050.83584: done with get_vars() 28983 1726883050.83621: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:44:10 -0400 (0:00:00.593) 0:01:20.834 ****** 28983 1726883050.83692: entering _queue_task() 
for managed_node2/include_tasks 28983 1726883050.84098: worker is 1 (out of 1 available) 28983 1726883050.84111: exiting _queue_task() for managed_node2/include_tasks 28983 1726883050.84123: done queuing things up, now waiting for results queue to drain 28983 1726883050.84125: waiting for pending results... 28983 1726883050.84342: running TaskExecutor() for managed_node2/TASK: Test 28983 1726883050.84447: in run() - task 0affe814-3a2d-b16d-c0a7-000000001009 28983 1726883050.84470: variable 'ansible_search_path' from source: unknown 28983 1726883050.84474: variable 'ansible_search_path' from source: unknown 28983 1726883050.84544: variable 'lsr_test' from source: include params 28983 1726883050.84748: variable 'lsr_test' from source: include params 28983 1726883050.84834: variable 'omit' from source: magic vars 28983 1726883050.84985: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883050.84994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883050.85005: variable 'omit' from source: magic vars 28983 1726883050.85320: variable 'ansible_distribution_major_version' from source: facts 28983 1726883050.85326: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883050.85337: variable 'item' from source: unknown 28983 1726883050.85415: variable 'item' from source: unknown 28983 1726883050.85449: variable 'item' from source: unknown 28983 1726883050.85510: variable 'item' from source: unknown 28983 1726883050.85678: dumping result to json 28983 1726883050.85682: done dumping result, returning 28983 1726883050.85685: done running TaskExecutor() for managed_node2/TASK: Test [0affe814-3a2d-b16d-c0a7-000000001009] 28983 1726883050.85688: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001009 28983 1726883050.85797: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001009 28983 1726883050.85824: no more pending results, returning what we have 28983 
1726883050.85828: in VariableManager get_vars() 28983 1726883050.85907: Calling all_inventory to load vars for managed_node2 28983 1726883050.85910: Calling groups_inventory to load vars for managed_node2 28983 1726883050.85947: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883050.85964: Calling all_plugins_play to load vars for managed_node2 28983 1726883050.85969: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883050.85975: Calling groups_plugins_play to load vars for managed_node2 28983 1726883050.86532: WORKER PROCESS EXITING 28983 1726883050.87500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883050.89892: done with get_vars() 28983 1726883050.89923: variable 'ansible_search_path' from source: unknown 28983 1726883050.89924: variable 'ansible_search_path' from source: unknown 28983 1726883050.89971: we have included files to process 28983 1726883050.89975: generating all_blocks data 28983 1726883050.89977: done generating all_blocks data 28983 1726883050.89983: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 28983 1726883050.89984: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 28983 1726883050.89987: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 28983 1726883050.90205: done processing included file 28983 1726883050.90208: iterating over new_blocks loaded from include file 28983 1726883050.90210: in VariableManager get_vars() 28983 1726883050.90228: done with get_vars() 28983 1726883050.90230: filtering new block on tags 28983 1726883050.90264: done filtering new block on tags 28983 1726883050.90267: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node2 => (item=tasks/remove_profile.yml) 28983 1726883050.90275: extending task lists for all hosts with included blocks 28983 1726883050.91502: done extending task lists 28983 1726883050.91503: done processing included files 28983 1726883050.91504: results queue empty 28983 1726883050.91505: checking for any_errors_fatal 28983 1726883050.91507: done checking for any_errors_fatal 28983 1726883050.91508: checking for max_fail_percentage 28983 1726883050.91509: done checking for max_fail_percentage 28983 1726883050.91510: checking to see if all hosts have failed and the running result is not ok 28983 1726883050.91511: done checking to see if all hosts have failed 28983 1726883050.91512: getting the remaining hosts for this loop 28983 1726883050.91514: done getting the remaining hosts for this loop 28983 1726883050.91517: getting the next task for host managed_node2 28983 1726883050.91522: done getting next task for host managed_node2 28983 1726883050.91525: ^ task is: TASK: Include network role 28983 1726883050.91529: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883050.91532: getting variables 28983 1726883050.91535: in VariableManager get_vars() 28983 1726883050.91548: Calling all_inventory to load vars for managed_node2 28983 1726883050.91551: Calling groups_inventory to load vars for managed_node2 28983 1726883050.91554: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883050.91561: Calling all_plugins_play to load vars for managed_node2 28983 1726883050.91564: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883050.91567: Calling groups_plugins_play to load vars for managed_node2 28983 1726883050.92798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883050.94742: done with get_vars() 28983 1726883050.94777: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 21:44:10 -0400 (0:00:00.111) 0:01:20.946 ****** 28983 1726883050.94882: entering _queue_task() for managed_node2/include_role 28983 1726883050.95217: worker is 1 (out of 1 available) 28983 1726883050.95232: exiting _queue_task() for managed_node2/include_role 28983 1726883050.95249: done queuing things up, now waiting for results queue to drain 28983 1726883050.95251: waiting for pending results... 
28983 1726883050.95543: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883050.95745: in run() - task 0affe814-3a2d-b16d-c0a7-0000000013e8 28983 1726883050.95749: variable 'ansible_search_path' from source: unknown 28983 1726883050.95752: variable 'ansible_search_path' from source: unknown 28983 1726883050.95767: calling self._execute() 28983 1726883050.95884: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883050.95896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883050.96074: variable 'omit' from source: magic vars 28983 1726883050.96469: variable 'ansible_distribution_major_version' from source: facts 28983 1726883050.96490: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883050.96510: _execute() done 28983 1726883050.96520: dumping result to json 28983 1726883050.96529: done dumping result, returning 28983 1726883050.96544: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-0000000013e8] 28983 1726883050.96555: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000013e8 28983 1726883050.96722: no more pending results, returning what we have 28983 1726883050.96728: in VariableManager get_vars() 28983 1726883050.96777: Calling all_inventory to load vars for managed_node2 28983 1726883050.96780: Calling groups_inventory to load vars for managed_node2 28983 1726883050.96784: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883050.96797: Calling all_plugins_play to load vars for managed_node2 28983 1726883050.96801: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883050.96804: Calling groups_plugins_play to load vars for managed_node2 28983 1726883050.97369: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000013e8 28983 1726883050.97373: WORKER PROCESS EXITING 28983 1726883050.99293: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.01299: done with get_vars() 28983 1726883051.01321: variable 'ansible_search_path' from source: unknown 28983 1726883051.01323: variable 'ansible_search_path' from source: unknown 28983 1726883051.01436: variable 'omit' from source: magic vars 28983 1726883051.01469: variable 'omit' from source: magic vars 28983 1726883051.01481: variable 'omit' from source: magic vars 28983 1726883051.01484: we have included files to process 28983 1726883051.01485: generating all_blocks data 28983 1726883051.01486: done generating all_blocks data 28983 1726883051.01487: processing included file: fedora.linux_system_roles.network 28983 1726883051.01504: in VariableManager get_vars() 28983 1726883051.01518: done with get_vars() 28983 1726883051.01544: in VariableManager get_vars() 28983 1726883051.01558: done with get_vars() 28983 1726883051.01589: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883051.01692: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883051.01763: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883051.02226: in VariableManager get_vars() 28983 1726883051.02254: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883051.04245: iterating over new_blocks loaded from include file 28983 1726883051.04247: in VariableManager get_vars() 28983 1726883051.04261: done with get_vars() 28983 1726883051.04262: filtering new block on tags 28983 1726883051.04493: done filtering new block on tags 28983 1726883051.04496: in VariableManager get_vars() 28983 1726883051.04509: done with get_vars() 28983 1726883051.04511: filtering new block on tags 28983 1726883051.04525: done 
filtering new block on tags 28983 1726883051.04527: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883051.04532: extending task lists for all hosts with included blocks 28983 1726883051.04623: done extending task lists 28983 1726883051.04624: done processing included files 28983 1726883051.04624: results queue empty 28983 1726883051.04625: checking for any_errors_fatal 28983 1726883051.04629: done checking for any_errors_fatal 28983 1726883051.04629: checking for max_fail_percentage 28983 1726883051.04630: done checking for max_fail_percentage 28983 1726883051.04631: checking to see if all hosts have failed and the running result is not ok 28983 1726883051.04631: done checking to see if all hosts have failed 28983 1726883051.04632: getting the remaining hosts for this loop 28983 1726883051.04633: done getting the remaining hosts for this loop 28983 1726883051.04637: getting the next task for host managed_node2 28983 1726883051.04640: done getting next task for host managed_node2 28983 1726883051.04642: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883051.04645: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883051.04653: getting variables 28983 1726883051.04653: in VariableManager get_vars() 28983 1726883051.04663: Calling all_inventory to load vars for managed_node2 28983 1726883051.04665: Calling groups_inventory to load vars for managed_node2 28983 1726883051.04667: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883051.04671: Calling all_plugins_play to load vars for managed_node2 28983 1726883051.04673: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883051.04676: Calling groups_plugins_play to load vars for managed_node2 28983 1726883051.05945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.08183: done with get_vars() 28983 1726883051.08208: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:11 -0400 (0:00:00.133) 0:01:21.080 ****** 28983 1726883051.08275: entering _queue_task() for managed_node2/include_tasks 28983 1726883051.08558: worker is 1 (out of 1 available) 28983 1726883051.08573: exiting _queue_task() for managed_node2/include_tasks 28983 1726883051.08589: done queuing things up, now waiting for results queue to drain 28983 1726883051.08591: waiting for pending results... 
28983 1726883051.08799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883051.08935: in run() - task 0affe814-3a2d-b16d-c0a7-00000000145f 28983 1726883051.08942: variable 'ansible_search_path' from source: unknown 28983 1726883051.08952: variable 'ansible_search_path' from source: unknown 28983 1726883051.08983: calling self._execute() 28983 1726883051.09076: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.09081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883051.09089: variable 'omit' from source: magic vars 28983 1726883051.09454: variable 'ansible_distribution_major_version' from source: facts 28983 1726883051.09483: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883051.09487: _execute() done 28983 1726883051.09490: dumping result to json 28983 1726883051.09495: done dumping result, returning 28983 1726883051.09498: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-00000000145f] 28983 1726883051.09500: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000145f 28983 1726883051.09639: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000145f 28983 1726883051.09698: no more pending results, returning what we have 28983 1726883051.09703: in VariableManager get_vars() 28983 1726883051.09759: Calling all_inventory to load vars for managed_node2 28983 1726883051.09763: Calling groups_inventory to load vars for managed_node2 28983 1726883051.09766: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883051.09780: Calling all_plugins_play to load vars for managed_node2 28983 1726883051.09784: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883051.09787: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883051.10351: WORKER PROCESS EXITING 28983 1726883051.11768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.13758: done with get_vars() 28983 1726883051.13779: variable 'ansible_search_path' from source: unknown 28983 1726883051.13780: variable 'ansible_search_path' from source: unknown 28983 1726883051.13816: we have included files to process 28983 1726883051.13817: generating all_blocks data 28983 1726883051.13819: done generating all_blocks data 28983 1726883051.13821: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883051.13822: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883051.13824: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883051.14301: done processing included file 28983 1726883051.14303: iterating over new_blocks loaded from include file 28983 1726883051.14304: in VariableManager get_vars() 28983 1726883051.14323: done with get_vars() 28983 1726883051.14324: filtering new block on tags 28983 1726883051.14354: done filtering new block on tags 28983 1726883051.14356: in VariableManager get_vars() 28983 1726883051.14375: done with get_vars() 28983 1726883051.14376: filtering new block on tags 28983 1726883051.14411: done filtering new block on tags 28983 1726883051.14413: in VariableManager get_vars() 28983 1726883051.14430: done with get_vars() 28983 1726883051.14431: filtering new block on tags 28983 1726883051.14468: done filtering new block on tags 28983 1726883051.14470: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883051.14475: extending task lists for all hosts 
with included blocks 28983 1726883051.16383: done extending task lists 28983 1726883051.16384: done processing included files 28983 1726883051.16385: results queue empty 28983 1726883051.16385: checking for any_errors_fatal 28983 1726883051.16388: done checking for any_errors_fatal 28983 1726883051.16388: checking for max_fail_percentage 28983 1726883051.16389: done checking for max_fail_percentage 28983 1726883051.16390: checking to see if all hosts have failed and the running result is not ok 28983 1726883051.16391: done checking to see if all hosts have failed 28983 1726883051.16391: getting the remaining hosts for this loop 28983 1726883051.16392: done getting the remaining hosts for this loop 28983 1726883051.16395: getting the next task for host managed_node2 28983 1726883051.16399: done getting next task for host managed_node2 28983 1726883051.16401: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883051.16404: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883051.16412: getting variables 28983 1726883051.16413: in VariableManager get_vars() 28983 1726883051.16423: Calling all_inventory to load vars for managed_node2 28983 1726883051.16425: Calling groups_inventory to load vars for managed_node2 28983 1726883051.16426: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883051.16430: Calling all_plugins_play to load vars for managed_node2 28983 1726883051.16432: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883051.16436: Calling groups_plugins_play to load vars for managed_node2 28983 1726883051.17515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.19649: done with get_vars() 28983 1726883051.19689: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:44:11 -0400 (0:00:00.115) 0:01:21.195 ****** 28983 1726883051.19802: entering _queue_task() for managed_node2/setup 28983 1726883051.20195: worker is 1 (out of 1 available) 28983 1726883051.20209: exiting _queue_task() for managed_node2/setup 28983 1726883051.20223: done queuing things up, now waiting for results queue to drain 28983 1726883051.20229: waiting for pending results... 
28983 1726883051.20536: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883051.20790: in run() - task 0affe814-3a2d-b16d-c0a7-0000000014b6 28983 1726883051.20794: variable 'ansible_search_path' from source: unknown 28983 1726883051.20811: variable 'ansible_search_path' from source: unknown 28983 1726883051.20846: calling self._execute() 28983 1726883051.20977: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.20983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883051.21011: variable 'omit' from source: magic vars 28983 1726883051.21534: variable 'ansible_distribution_major_version' from source: facts 28983 1726883051.21543: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883051.21842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883051.24229: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883051.24290: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883051.24322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883051.24357: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883051.24382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883051.24452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883051.24479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883051.24504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883051.24540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883051.24553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883051.24602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883051.24623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883051.24646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883051.24681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883051.24692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883051.24823: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883051.24831: variable 'ansible_facts' from source: unknown 28983 1726883051.25525: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883051.25529: when evaluation is False, skipping this task 28983 1726883051.25532: _execute() done 28983 1726883051.25536: dumping result to json 28983 1726883051.25560: done dumping result, returning 28983 1726883051.25564: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-0000000014b6] 28983 1726883051.25566: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014b6 28983 1726883051.25660: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014b6 28983 1726883051.25662: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883051.25715: no more pending results, returning what we have 28983 1726883051.25719: results queue empty 28983 1726883051.25720: checking for any_errors_fatal 28983 1726883051.25722: done checking for any_errors_fatal 28983 1726883051.25723: checking for max_fail_percentage 28983 1726883051.25726: done checking for max_fail_percentage 28983 1726883051.25727: checking to see if all hosts have failed and the running result is not ok 28983 1726883051.25728: done checking to see if all hosts have failed 28983 1726883051.25728: getting the remaining hosts for this loop 28983 1726883051.25731: done getting the remaining hosts for this loop 28983 1726883051.25737: getting the next task for host managed_node2 28983 1726883051.25749: done getting next task for host managed_node2 28983 1726883051.25754: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883051.25760: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883051.25789: getting variables 28983 1726883051.25791: in VariableManager get_vars() 28983 1726883051.25833: Calling all_inventory to load vars for managed_node2 28983 1726883051.25847: Calling groups_inventory to load vars for managed_node2 28983 1726883051.25850: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883051.25860: Calling all_plugins_play to load vars for managed_node2 28983 1726883051.25863: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883051.25881: Calling groups_plugins_play to load vars for managed_node2 28983 1726883051.27594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.29990: done with get_vars() 28983 1726883051.30024: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:44:11 -0400 (0:00:00.103) 0:01:21.298 ****** 28983 1726883051.30110: entering _queue_task() for managed_node2/stat 28983 1726883051.30399: worker is 1 (out of 1 available) 28983 1726883051.30414: exiting _queue_task() for managed_node2/stat 28983 1726883051.30428: done queuing things up, now waiting for results queue to drain 28983 1726883051.30430: waiting for pending results... 
28983 1726883051.30655: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883051.30784: in run() - task 0affe814-3a2d-b16d-c0a7-0000000014b8 28983 1726883051.30800: variable 'ansible_search_path' from source: unknown 28983 1726883051.30804: variable 'ansible_search_path' from source: unknown 28983 1726883051.30840: calling self._execute() 28983 1726883051.30930: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.30937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883051.30948: variable 'omit' from source: magic vars 28983 1726883051.31291: variable 'ansible_distribution_major_version' from source: facts 28983 1726883051.31302: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883051.31452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883051.31679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883051.31716: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883051.31746: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883051.31782: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883051.31851: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883051.31873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883051.31938: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883051.31968: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883051.32071: variable '__network_is_ostree' from source: set_fact 28983 1726883051.32078: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883051.32082: when evaluation is False, skipping this task 28983 1726883051.32085: _execute() done 28983 1726883051.32091: dumping result to json 28983 1726883051.32094: done dumping result, returning 28983 1726883051.32105: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-0000000014b8] 28983 1726883051.32111: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014b8 28983 1726883051.32243: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014b8 28983 1726883051.32250: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883051.32328: no more pending results, returning what we have 28983 1726883051.32333: results queue empty 28983 1726883051.32336: checking for any_errors_fatal 28983 1726883051.32343: done checking for any_errors_fatal 28983 1726883051.32344: checking for max_fail_percentage 28983 1726883051.32346: done checking for max_fail_percentage 28983 1726883051.32347: checking to see if all hosts have failed and the running result is not ok 28983 1726883051.32348: done checking to see if all hosts have failed 28983 1726883051.32349: getting the remaining hosts for this loop 28983 1726883051.32350: done getting the remaining hosts for this loop 28983 
1726883051.32354: getting the next task for host managed_node2 28983 1726883051.32361: done getting next task for host managed_node2 28983 1726883051.32366: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883051.32371: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883051.32393: getting variables 28983 1726883051.32394: in VariableManager get_vars() 28983 1726883051.32443: Calling all_inventory to load vars for managed_node2 28983 1726883051.32447: Calling groups_inventory to load vars for managed_node2 28983 1726883051.32450: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883051.32459: Calling all_plugins_play to load vars for managed_node2 28983 1726883051.32462: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883051.32465: Calling groups_plugins_play to load vars for managed_node2 28983 1726883051.33855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.35838: done with get_vars() 28983 1726883051.35877: done getting variables 28983 1726883051.35949: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:44:11 -0400 (0:00:00.058) 0:01:21.357 ****** 28983 1726883051.35996: entering _queue_task() for managed_node2/set_fact 28983 1726883051.36262: worker is 1 (out of 1 available) 28983 1726883051.36276: exiting _queue_task() for managed_node2/set_fact 28983 1726883051.36289: done queuing things up, now waiting for results queue to drain 28983 1726883051.36291: waiting for pending results... 
28983 1726883051.36560: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883051.36732: in run() - task 0affe814-3a2d-b16d-c0a7-0000000014b9 28983 1726883051.36760: variable 'ansible_search_path' from source: unknown 28983 1726883051.36764: variable 'ansible_search_path' from source: unknown 28983 1726883051.36820: calling self._execute() 28983 1726883051.36921: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.36925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883051.36952: variable 'omit' from source: magic vars 28983 1726883051.37361: variable 'ansible_distribution_major_version' from source: facts 28983 1726883051.37374: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883051.37527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883051.37795: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883051.37831: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883051.37861: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883051.37896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883051.37966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883051.37992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883051.38017: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883051.38040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883051.38113: variable '__network_is_ostree' from source: set_fact 28983 1726883051.38122: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883051.38125: when evaluation is False, skipping this task 28983 1726883051.38128: _execute() done 28983 1726883051.38130: dumping result to json 28983 1726883051.38137: done dumping result, returning 28983 1726883051.38145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-0000000014b9] 28983 1726883051.38150: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014b9 28983 1726883051.38243: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014b9 28983 1726883051.38246: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883051.38296: no more pending results, returning what we have 28983 1726883051.38300: results queue empty 28983 1726883051.38301: checking for any_errors_fatal 28983 1726883051.38307: done checking for any_errors_fatal 28983 1726883051.38308: checking for max_fail_percentage 28983 1726883051.38310: done checking for max_fail_percentage 28983 1726883051.38311: checking to see if all hosts have failed and the running result is not ok 28983 1726883051.38312: done checking to see if all hosts have failed 28983 1726883051.38313: getting the remaining hosts for this loop 28983 1726883051.38314: done getting the remaining hosts for this loop 
28983 1726883051.38318: getting the next task for host managed_node2 28983 1726883051.38329: done getting next task for host managed_node2 28983 1726883051.38335: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883051.38341: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883051.38361: getting variables 28983 1726883051.38363: in VariableManager get_vars() 28983 1726883051.38398: Calling all_inventory to load vars for managed_node2 28983 1726883051.38402: Calling groups_inventory to load vars for managed_node2 28983 1726883051.38404: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883051.38412: Calling all_plugins_play to load vars for managed_node2 28983 1726883051.38415: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883051.38423: Calling groups_plugins_play to load vars for managed_node2 28983 1726883051.39647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883051.41266: done with get_vars() 28983 1726883051.41290: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:44:11 -0400 (0:00:00.053) 0:01:21.411 ****** 28983 1726883051.41367: entering _queue_task() for managed_node2/service_facts 28983 1726883051.41575: worker is 1 (out of 1 available) 28983 1726883051.41588: exiting _queue_task() for managed_node2/service_facts 28983 1726883051.41602: done queuing things up, now waiting for results queue to drain 28983 1726883051.41604: waiting for pending results... 
28983 1726883051.41807: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883051.41923: in run() - task 0affe814-3a2d-b16d-c0a7-0000000014bb 28983 1726883051.41938: variable 'ansible_search_path' from source: unknown 28983 1726883051.41943: variable 'ansible_search_path' from source: unknown 28983 1726883051.41980: calling self._execute() 28983 1726883051.42065: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.42075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883051.42084: variable 'omit' from source: magic vars 28983 1726883051.42410: variable 'ansible_distribution_major_version' from source: facts 28983 1726883051.42421: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883051.42428: variable 'omit' from source: magic vars 28983 1726883051.42490: variable 'omit' from source: magic vars 28983 1726883051.42524: variable 'omit' from source: magic vars 28983 1726883051.42560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883051.42592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883051.42616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883051.42632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883051.42643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883051.42674: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883051.42678: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.42680: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883051.42764: Set connection var ansible_connection to ssh 28983 1726883051.42776: Set connection var ansible_shell_executable to /bin/sh 28983 1726883051.42784: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883051.42792: Set connection var ansible_timeout to 10 28983 1726883051.42799: Set connection var ansible_pipelining to False 28983 1726883051.42801: Set connection var ansible_shell_type to sh 28983 1726883051.42824: variable 'ansible_shell_executable' from source: unknown 28983 1726883051.42827: variable 'ansible_connection' from source: unknown 28983 1726883051.42831: variable 'ansible_module_compression' from source: unknown 28983 1726883051.42834: variable 'ansible_shell_type' from source: unknown 28983 1726883051.42837: variable 'ansible_shell_executable' from source: unknown 28983 1726883051.42850: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883051.42853: variable 'ansible_pipelining' from source: unknown 28983 1726883051.42855: variable 'ansible_timeout' from source: unknown 28983 1726883051.42858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883051.43019: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883051.43030: variable 'omit' from source: magic vars 28983 1726883051.43034: starting attempt loop 28983 1726883051.43041: running the handler 28983 1726883051.43053: _low_level_execute_command(): starting 28983 1726883051.43063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883051.43619: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883051.43623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.43627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883051.43629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.43672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883051.43676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883051.43764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883051.45529: stdout chunk (state=3): >>>/root <<< 28983 1726883051.45638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883051.45690: stderr chunk (state=3): >>><<< 28983 1726883051.45694: stdout chunk (state=3): >>><<< 28983 1726883051.45714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883051.45724: _low_level_execute_command(): starting 28983 1726883051.45730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835 `" && echo ansible-tmp-1726883051.457133-31965-49739636504835="` echo /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835 `" ) && sleep 0' 28983 1726883051.46187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883051.46193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.46195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883051.46207: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.46256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883051.46262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883051.46337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883051.48369: stdout chunk (state=3): >>>ansible-tmp-1726883051.457133-31965-49739636504835=/root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835 <<< 28983 1726883051.48498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883051.48533: stderr chunk (state=3): >>><<< 28983 1726883051.48538: stdout chunk (state=3): >>><<< 28983 1726883051.48551: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883051.457133-31965-49739636504835=/root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883051.48587: variable 'ansible_module_compression' from source: unknown 28983 1726883051.48624: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883051.48654: variable 'ansible_facts' from source: unknown 28983 1726883051.48717: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py 28983 1726883051.48824: Sending initial data 28983 1726883051.48828: Sent initial data (160 bytes) 28983 1726883051.49293: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883051.49296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.49300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found <<< 28983 1726883051.49303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.49344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883051.49348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883051.49423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883051.51069: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883051.51075: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883051.51139: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883051.51208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpgrzoay97 /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py <<< 28983 1726883051.51216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py" <<< 28983 1726883051.51280: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpgrzoay97" to remote "/root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py" <<< 28983 1726883051.51283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py" <<< 28983 1726883051.52215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883051.52268: stderr chunk (state=3): >>><<< 28983 1726883051.52272: stdout chunk (state=3): >>><<< 28983 1726883051.52288: done transferring module to remote 28983 1726883051.52296: _low_level_execute_command(): starting 28983 1726883051.52302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/ /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py && sleep 0' 28983 1726883051.52710: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883051.52745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883051.52749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.52752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883051.52756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883051.52758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.52812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883051.52815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883051.52892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883051.54749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883051.54789: stderr chunk (state=3): >>><<< 28983 1726883051.54792: stdout chunk (state=3): >>><<< 28983 1726883051.54806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883051.54809: _low_level_execute_command(): starting 28983 1726883051.54819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/AnsiballZ_service_facts.py && sleep 0' 28983 1726883051.55233: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883051.55239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.55241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883051.55243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883051.55301: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883051.55307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883051.55385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883053.53897: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "<<< 28983 1726883053.53967: stdout chunk (state=3): >>>source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", 
"source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "sour<<< 28983 1726883053.53976: stdout chunk (state=3): >>>ce": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883053.55945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883053.55949: stdout chunk (state=3): >>><<< 28983 1726883053.55952: stderr chunk (state=3): >>><<< 28983 1726883053.55961: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": 
"bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": 
"plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883053.65044: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883053.65049: _low_level_execute_command(): starting 28983 1726883053.65052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883051.457133-31965-49739636504835/ > /dev/null 2>&1 && sleep 0' 28983 1726883053.65531: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883053.65535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.65540: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883053.65542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.65599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883053.65602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883053.65606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883053.65682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883053.67669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883053.67737: stderr chunk (state=3): >>><<< 28983 1726883053.67741: stdout chunk (state=3): >>><<< 28983 1726883053.67752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883053.67763: handler run complete 28983 1726883053.68116: variable 'ansible_facts' from source: unknown 28983 1726883053.68358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883053.68984: variable 'ansible_facts' from source: unknown 28983 1726883053.69118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883053.69316: attempt loop complete, returning result 28983 1726883053.69322: _execute() done 28983 1726883053.69325: dumping result to json 28983 1726883053.69377: done dumping result, returning 28983 1726883053.69383: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-0000000014bb] 28983 1726883053.69388: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014bb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883053.74527: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014bb 28983 1726883053.74531: WORKER PROCESS EXITING 28983 1726883053.74542: no more pending results, returning what we have 28983 1726883053.74544: results queue empty 28983 1726883053.74545: checking for any_errors_fatal 28983 1726883053.74547: done checking for any_errors_fatal 28983 1726883053.74548: checking for max_fail_percentage 28983 1726883053.74549: done checking for max_fail_percentage 28983 1726883053.74549: checking to see if all hosts have failed 
and the running result is not ok 28983 1726883053.74550: done checking to see if all hosts have failed 28983 1726883053.74550: getting the remaining hosts for this loop 28983 1726883053.74551: done getting the remaining hosts for this loop 28983 1726883053.74554: getting the next task for host managed_node2 28983 1726883053.74558: done getting next task for host managed_node2 28983 1726883053.74560: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883053.74567: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883053.74575: getting variables 28983 1726883053.74576: in VariableManager get_vars() 28983 1726883053.74590: Calling all_inventory to load vars for managed_node2 28983 1726883053.74592: Calling groups_inventory to load vars for managed_node2 28983 1726883053.74593: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883053.74598: Calling all_plugins_play to load vars for managed_node2 28983 1726883053.74600: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883053.74602: Calling groups_plugins_play to load vars for managed_node2 28983 1726883053.75655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883053.77209: done with get_vars() 28983 1726883053.77232: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:44:13 -0400 (0:00:02.359) 0:01:23.770 ****** 28983 1726883053.77300: entering _queue_task() for managed_node2/package_facts 28983 1726883053.77565: worker is 1 (out of 1 available) 28983 1726883053.77580: exiting _queue_task() for managed_node2/package_facts 28983 1726883053.77592: done queuing things up, now waiting for results queue to drain 28983 1726883053.77594: waiting for pending results... 
28983 1726883053.77797: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883053.77926: in run() - task 0affe814-3a2d-b16d-c0a7-0000000014bc 28983 1726883053.77941: variable 'ansible_search_path' from source: unknown 28983 1726883053.77944: variable 'ansible_search_path' from source: unknown 28983 1726883053.77975: calling self._execute() 28983 1726883053.78067: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883053.78073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883053.78087: variable 'omit' from source: magic vars 28983 1726883053.78425: variable 'ansible_distribution_major_version' from source: facts 28983 1726883053.78438: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883053.78444: variable 'omit' from source: magic vars 28983 1726883053.78511: variable 'omit' from source: magic vars 28983 1726883053.78541: variable 'omit' from source: magic vars 28983 1726883053.78582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883053.78616: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883053.78638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883053.78654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883053.78665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883053.78698: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883053.78702: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883053.78705: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883053.78789: Set connection var ansible_connection to ssh 28983 1726883053.78799: Set connection var ansible_shell_executable to /bin/sh 28983 1726883053.78807: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883053.78822: Set connection var ansible_timeout to 10 28983 1726883053.78825: Set connection var ansible_pipelining to False 28983 1726883053.78828: Set connection var ansible_shell_type to sh 28983 1726883053.78850: variable 'ansible_shell_executable' from source: unknown 28983 1726883053.78853: variable 'ansible_connection' from source: unknown 28983 1726883053.78857: variable 'ansible_module_compression' from source: unknown 28983 1726883053.78859: variable 'ansible_shell_type' from source: unknown 28983 1726883053.78862: variable 'ansible_shell_executable' from source: unknown 28983 1726883053.78867: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883053.78872: variable 'ansible_pipelining' from source: unknown 28983 1726883053.78878: variable 'ansible_timeout' from source: unknown 28983 1726883053.78884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883053.79050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883053.79060: variable 'omit' from source: magic vars 28983 1726883053.79066: starting attempt loop 28983 1726883053.79070: running the handler 28983 1726883053.79086: _low_level_execute_command(): starting 28983 1726883053.79094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883053.79640: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883053.79644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.79648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883053.79651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.79710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883053.79715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883053.79719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883053.79790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883053.81563: stdout chunk (state=3): >>>/root <<< 28983 1726883053.81676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883053.81727: stderr chunk (state=3): >>><<< 28983 1726883053.81731: stdout chunk (state=3): >>><<< 28983 1726883053.81755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883053.81768: _low_level_execute_command(): starting 28983 1726883053.81779: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203 `" && echo ansible-tmp-1726883053.8175354-32096-136557709626203="` echo /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203 `" ) && sleep 0' 28983 1726883053.82231: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883053.82236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.82239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883053.82249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.82298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883053.82302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883053.82379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883053.84367: stdout chunk (state=3): >>>ansible-tmp-1726883053.8175354-32096-136557709626203=/root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203 <<< 28983 1726883053.84488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883053.84533: stderr chunk (state=3): >>><<< 28983 1726883053.84539: stdout chunk (state=3): >>><<< 28983 1726883053.84556: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883053.8175354-32096-136557709626203=/root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883053.84591: variable 'ansible_module_compression' from source: unknown 28983 1726883053.84628: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883053.84676: variable 'ansible_facts' from source: unknown 28983 1726883053.84808: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py 28983 1726883053.84922: Sending initial data 28983 1726883053.84926: Sent initial data (162 bytes) 28983 1726883053.85381: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883053.85385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883053.85388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883053.85392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.85445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883053.85450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883053.85521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883053.87142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883053.87145: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883053.87204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883053.87274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp32in_iee /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py <<< 28983 1726883053.87277: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py" <<< 28983 1726883053.87345: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp32in_iee" to remote "/root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py" <<< 28983 1726883053.89150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883053.89207: stderr chunk (state=3): >>><<< 28983 1726883053.89211: stdout chunk (state=3): >>><<< 28983 1726883053.89228: done transferring module to remote 28983 1726883053.89240: _low_level_execute_command(): starting 28983 1726883053.89245: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/ /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py && sleep 0' 28983 1726883053.89678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883053.89683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883053.89685: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.89688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883053.89690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.89748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883053.89753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883053.89815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883053.91669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883053.91715: stderr chunk (state=3): >>><<< 28983 1726883053.91719: stdout chunk (state=3): >>><<< 28983 1726883053.91729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883053.91732: _low_level_execute_command(): starting 28983 1726883053.91740: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/AnsiballZ_package_facts.py && sleep 0' 28983 1726883053.92139: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883053.92163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883053.92166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883053.92224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883053.92236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726883053.92302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883054.55856: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": 
"9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 28983 1726883054.56023: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": 
"2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": 
"1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": 
"cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": 
"langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": 
[{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", 
"version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726883054.56089: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": 
"3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": 
"rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": 
[{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": 
"dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": 
"python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883054.57913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883054.57946: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883054.58041: stderr chunk (state=3): >>><<< 28983 1726883054.58093: stdout chunk (state=3): >>><<< 28983 1726883054.58107: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": 
[{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": 
"10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": 
[{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", 
"release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": 
"5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", 
"version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", 
"version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": 
[{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": 
"1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": 
[{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": 
[{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": 
[{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": 
[{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": 
"0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": 
"500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": 
"0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": 
[{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883054.62767: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883054.62771: _low_level_execute_command(): starting 28983 1726883054.62774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883053.8175354-32096-136557709626203/ > /dev/null 2>&1 && sleep 0' 28983 1726883054.63435: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883054.63527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883054.63550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883054.63602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883054.63618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883054.63726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883054.65799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883054.65836: stdout chunk (state=3): >>><<< 28983 1726883054.65855: stderr chunk (state=3): >>><<< 28983 1726883054.65876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883054.65889: handler run complete 28983 1726883054.68837: variable 'ansible_facts' from source: unknown 28983 1726883054.69784: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883054.74841: variable 'ansible_facts' from source: unknown 28983 1726883054.75665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883054.77241: attempt loop complete, returning result 28983 1726883054.77280: _execute() done 28983 1726883054.77291: dumping result to json 28983 1726883054.77684: done dumping result, returning 28983 1726883054.77708: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-0000000014bc] 28983 1726883054.77720: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014bc 28983 1726883054.84044: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000014bc 28983 1726883054.84048: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883054.84236: no more pending results, returning what we have 28983 1726883054.84240: results queue empty 28983 1726883054.84244: checking for any_errors_fatal 28983 1726883054.84251: done checking for any_errors_fatal 28983 1726883054.84253: checking for max_fail_percentage 28983 1726883054.84255: done checking for max_fail_percentage 28983 1726883054.84256: checking to see if all hosts have failed and the running result is not ok 28983 1726883054.84257: done checking to see if all hosts have failed 28983 1726883054.84258: getting the remaining hosts for this loop 28983 1726883054.84261: done getting the remaining hosts for this loop 28983 1726883054.84265: getting the next task for host managed_node2 28983 1726883054.84276: done getting next task for host managed_node2 28983 1726883054.84281: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883054.84288: 
^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883054.84302: getting variables 28983 1726883054.84304: in VariableManager get_vars() 28983 1726883054.84342: Calling all_inventory to load vars for managed_node2 28983 1726883054.84346: Calling groups_inventory to load vars for managed_node2 28983 1726883054.84353: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883054.84363: Calling all_plugins_play to load vars for managed_node2 28983 1726883054.84367: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883054.84371: Calling groups_plugins_play to load vars for managed_node2 28983 1726883054.86563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883054.89800: done with get_vars() 28983 1726883054.89838: done getting variables 28983 1726883054.89916: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:14 -0400 (0:00:01.126) 0:01:24.897 ****** 28983 1726883054.89966: entering _queue_task() for managed_node2/debug 28983 1726883054.90565: worker is 1 (out of 1 available) 28983 1726883054.90579: exiting _queue_task() for managed_node2/debug 28983 1726883054.90591: done queuing things up, now waiting for results queue to drain 28983 1726883054.90593: waiting for pending results... 
28983 1726883054.90790: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883054.91013: in run() - task 0affe814-3a2d-b16d-c0a7-000000001460 28983 1726883054.91144: variable 'ansible_search_path' from source: unknown 28983 1726883054.91149: variable 'ansible_search_path' from source: unknown 28983 1726883054.91152: calling self._execute() 28983 1726883054.91237: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883054.91256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883054.91284: variable 'omit' from source: magic vars 28983 1726883054.91781: variable 'ansible_distribution_major_version' from source: facts 28983 1726883054.91801: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883054.91808: variable 'omit' from source: magic vars 28983 1726883054.91882: variable 'omit' from source: magic vars 28983 1726883054.91965: variable 'network_provider' from source: set_fact 28983 1726883054.91984: variable 'omit' from source: magic vars 28983 1726883054.92024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883054.92058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883054.92088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883054.92102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883054.92112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883054.92144: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883054.92147: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883054.92152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883054.92233: Set connection var ansible_connection to ssh 28983 1726883054.92245: Set connection var ansible_shell_executable to /bin/sh 28983 1726883054.92256: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883054.92264: Set connection var ansible_timeout to 10 28983 1726883054.92270: Set connection var ansible_pipelining to False 28983 1726883054.92276: Set connection var ansible_shell_type to sh 28983 1726883054.92297: variable 'ansible_shell_executable' from source: unknown 28983 1726883054.92301: variable 'ansible_connection' from source: unknown 28983 1726883054.92303: variable 'ansible_module_compression' from source: unknown 28983 1726883054.92306: variable 'ansible_shell_type' from source: unknown 28983 1726883054.92310: variable 'ansible_shell_executable' from source: unknown 28983 1726883054.92314: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883054.92321: variable 'ansible_pipelining' from source: unknown 28983 1726883054.92323: variable 'ansible_timeout' from source: unknown 28983 1726883054.92328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883054.92451: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883054.92463: variable 'omit' from source: magic vars 28983 1726883054.92469: starting attempt loop 28983 1726883054.92476: running the handler 28983 1726883054.92515: handler run complete 28983 1726883054.92529: attempt loop complete, returning result 28983 1726883054.92534: _execute() done 28983 1726883054.92537: dumping result to json 28983 1726883054.92546: done dumping result, returning 
28983 1726883054.92551: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000001460] 28983 1726883054.92557: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001460 28983 1726883054.92650: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001460 28983 1726883054.92653: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883054.92747: no more pending results, returning what we have 28983 1726883054.92751: results queue empty 28983 1726883054.92752: checking for any_errors_fatal 28983 1726883054.92764: done checking for any_errors_fatal 28983 1726883054.92765: checking for max_fail_percentage 28983 1726883054.92767: done checking for max_fail_percentage 28983 1726883054.92768: checking to see if all hosts have failed and the running result is not ok 28983 1726883054.92769: done checking to see if all hosts have failed 28983 1726883054.92770: getting the remaining hosts for this loop 28983 1726883054.92774: done getting the remaining hosts for this loop 28983 1726883054.92779: getting the next task for host managed_node2 28983 1726883054.92787: done getting next task for host managed_node2 28983 1726883054.92791: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883054.92797: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883054.92809: getting variables 28983 1726883054.92811: in VariableManager get_vars() 28983 1726883054.92855: Calling all_inventory to load vars for managed_node2 28983 1726883054.92858: Calling groups_inventory to load vars for managed_node2 28983 1726883054.92860: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883054.92869: Calling all_plugins_play to load vars for managed_node2 28983 1726883054.92875: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883054.92878: Calling groups_plugins_play to load vars for managed_node2 28983 1726883054.94396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883054.96442: done with get_vars() 28983 1726883054.96468: done getting variables 28983 1726883054.96517: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:14 -0400 (0:00:00.065) 0:01:24.963 ****** 28983 1726883054.96553: entering _queue_task() for managed_node2/fail 28983 1726883054.96802: worker is 1 (out of 1 available) 28983 1726883054.96817: exiting _queue_task() for managed_node2/fail 28983 1726883054.96829: done queuing things up, now waiting for results queue to drain 28983 1726883054.96831: waiting for pending results... 28983 1726883054.97031: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883054.97159: in run() - task 0affe814-3a2d-b16d-c0a7-000000001461 28983 1726883054.97177: variable 'ansible_search_path' from source: unknown 28983 1726883054.97181: variable 'ansible_search_path' from source: unknown 28983 1726883054.97211: calling self._execute() 28983 1726883054.97299: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883054.97306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883054.97315: variable 'omit' from source: magic vars 28983 1726883054.97641: variable 'ansible_distribution_major_version' from source: facts 28983 1726883054.97653: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883054.97760: variable 'network_state' from source: role '' defaults 28983 1726883054.97769: Evaluated conditional (network_state != {}): False 28983 1726883054.97775: when evaluation is False, skipping this task 28983 1726883054.97778: _execute() done 28983 1726883054.97781: dumping result to json 28983 1726883054.97784: done dumping result, returning 28983 1726883054.97791: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000001461] 28983 1726883054.97797: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001461 28983 1726883054.97898: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001461 28983 1726883054.97901: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883054.97959: no more pending results, returning what we have 28983 1726883054.97963: results queue empty 28983 1726883054.97964: checking for any_errors_fatal 28983 1726883054.97970: done checking for any_errors_fatal 28983 1726883054.97971: checking for max_fail_percentage 28983 1726883054.97976: done checking for max_fail_percentage 28983 1726883054.97977: checking to see if all hosts have failed and the running result is not ok 28983 1726883054.97978: done checking to see if all hosts have failed 28983 1726883054.97979: getting the remaining hosts for this loop 28983 1726883054.97980: done getting the remaining hosts for this loop 28983 1726883054.97985: getting the next task for host managed_node2 28983 1726883054.97992: done getting next task for host managed_node2 28983 1726883054.97997: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883054.98004: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883054.98026: getting variables 28983 1726883054.98028: in VariableManager get_vars() 28983 1726883054.98066: Calling all_inventory to load vars for managed_node2 28983 1726883054.98069: Calling groups_inventory to load vars for managed_node2 28983 1726883054.98074: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883054.98083: Calling all_plugins_play to load vars for managed_node2 28983 1726883054.98087: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883054.98090: Calling groups_plugins_play to load vars for managed_node2 28983 1726883054.99648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883055.01882: done with get_vars() 28983 1726883055.01905: done getting variables 28983 1726883055.01955: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:15 -0400 (0:00:00.054) 0:01:25.017 ****** 28983 1726883055.01986: entering _queue_task() for managed_node2/fail 28983 1726883055.02219: worker is 1 (out of 1 available) 28983 1726883055.02236: exiting _queue_task() for managed_node2/fail 28983 1726883055.02248: done queuing things up, now waiting for results queue to drain 28983 1726883055.02250: waiting for pending results... 28983 1726883055.02454: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883055.02582: in run() - task 0affe814-3a2d-b16d-c0a7-000000001462 28983 1726883055.02597: variable 'ansible_search_path' from source: unknown 28983 1726883055.02600: variable 'ansible_search_path' from source: unknown 28983 1726883055.02631: calling self._execute() 28983 1726883055.02740: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883055.02746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883055.02761: variable 'omit' from source: magic vars 28983 1726883055.03154: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.03339: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883055.03343: variable 'network_state' from source: role '' defaults 28983 1726883055.03353: Evaluated conditional (network_state != {}): False 28983 1726883055.03361: when evaluation is False, skipping this task 28983 1726883055.03368: _execute() done 28983 1726883055.03379: dumping result to json 28983 1726883055.03388: done dumping result, returning 28983 1726883055.03399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000001462] 28983 1726883055.03411: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001462 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883055.03592: no more pending results, returning what we have 28983 1726883055.03597: results queue empty 28983 1726883055.03598: checking for any_errors_fatal 28983 1726883055.03608: done checking for any_errors_fatal 28983 1726883055.03609: checking for max_fail_percentage 28983 1726883055.03611: done checking for max_fail_percentage 28983 1726883055.03612: checking to see if all hosts have failed and the running result is not ok 28983 1726883055.03613: done checking to see if all hosts have failed 28983 1726883055.03614: getting the remaining hosts for this loop 28983 1726883055.03617: done getting the remaining hosts for this loop 28983 1726883055.03622: getting the next task for host managed_node2 28983 1726883055.03632: done getting next task for host managed_node2 28983 1726883055.03639: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883055.03646: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883055.03680: getting variables 28983 1726883055.03683: in VariableManager get_vars() 28983 1726883055.03730: Calling all_inventory to load vars for managed_node2 28983 1726883055.04028: Calling groups_inventory to load vars for managed_node2 28983 1726883055.04035: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883055.04042: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001462 28983 1726883055.04046: WORKER PROCESS EXITING 28983 1726883055.04054: Calling all_plugins_play to load vars for managed_node2 28983 1726883055.04058: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883055.04062: Calling groups_plugins_play to load vars for managed_node2 28983 1726883055.05883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883055.08153: done with get_vars() 28983 1726883055.08179: done getting variables 28983 1726883055.08231: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:15 -0400 (0:00:00.062) 0:01:25.080 ****** 28983 1726883055.08265: entering _queue_task() for managed_node2/fail 28983 1726883055.08546: worker is 1 (out of 1 available) 28983 1726883055.08561: exiting _queue_task() for managed_node2/fail 28983 1726883055.08578: done queuing things up, now waiting for results queue to drain 28983 1726883055.08580: waiting for pending results... 28983 1726883055.08800: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883055.09029: in run() - task 0affe814-3a2d-b16d-c0a7-000000001463 28983 1726883055.09066: variable 'ansible_search_path' from source: unknown 28983 1726883055.09081: variable 'ansible_search_path' from source: unknown 28983 1726883055.09128: calling self._execute() 28983 1726883055.09341: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883055.09345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883055.09348: variable 'omit' from source: magic vars 28983 1726883055.09831: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.09858: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883055.10049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883055.11921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883055.11985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883055.12016: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 
1726883055.12050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883055.12074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883055.12148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.12215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.12238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.12304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.12333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.12467: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.12475: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883055.12659: variable 'ansible_distribution' from source: facts 28983 1726883055.12663: variable '__network_rh_distros' from source: role '' defaults 28983 1726883055.12738: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883055.12742: when evaluation is False, skipping this task 28983 1726883055.12747: _execute() done 28983 1726883055.12750: dumping result to json 28983 
1726883055.12755: done dumping result, returning 28983 1726883055.12758: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000001463] 28983 1726883055.12761: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001463 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883055.12948: no more pending results, returning what we have 28983 1726883055.12952: results queue empty 28983 1726883055.12953: checking for any_errors_fatal 28983 1726883055.12961: done checking for any_errors_fatal 28983 1726883055.12962: checking for max_fail_percentage 28983 1726883055.12964: done checking for max_fail_percentage 28983 1726883055.12968: checking to see if all hosts have failed and the running result is not ok 28983 1726883055.12969: done checking to see if all hosts have failed 28983 1726883055.12970: getting the remaining hosts for this loop 28983 1726883055.12974: done getting the remaining hosts for this loop 28983 1726883055.12980: getting the next task for host managed_node2 28983 1726883055.12989: done getting next task for host managed_node2 28983 1726883055.12998: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883055.13004: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883055.13032: getting variables 28983 1726883055.13037: in VariableManager get_vars() 28983 1726883055.13099: Calling all_inventory to load vars for managed_node2 28983 1726883055.13103: Calling groups_inventory to load vars for managed_node2 28983 1726883055.13106: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883055.13185: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001463 28983 1726883055.13187: WORKER PROCESS EXITING 28983 1726883055.13197: Calling all_plugins_play to load vars for managed_node2 28983 1726883055.13202: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883055.13206: Calling groups_plugins_play to load vars for managed_node2 28983 1726883055.15494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883055.20633: done with get_vars() 28983 1726883055.20717: done getting variables 28983 1726883055.20844: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:15 -0400 (0:00:00.126) 0:01:25.206 ****** 28983 1726883055.20926: entering _queue_task() for managed_node2/dnf 28983 1726883055.21528: worker is 1 (out of 1 available) 28983 1726883055.21647: exiting _queue_task() for managed_node2/dnf 28983 1726883055.21663: done queuing things up, now waiting for results queue to drain 28983 1726883055.21665: waiting for pending results... 28983 1726883055.21959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883055.22226: in run() - task 0affe814-3a2d-b16d-c0a7-000000001464 28983 1726883055.22296: variable 'ansible_search_path' from source: unknown 28983 1726883055.22300: variable 'ansible_search_path' from source: unknown 28983 1726883055.22319: calling self._execute() 28983 1726883055.22468: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883055.22513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883055.22522: variable 'omit' from source: magic vars 28983 1726883055.23239: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.23244: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883055.23414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883055.27172: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883055.27259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883055.27313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883055.27363: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883055.27398: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883055.27494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.27533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.27584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.27654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.27676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.27816: variable 'ansible_distribution' from source: facts 28983 1726883055.27848: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.27908: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883055.28158: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883055.28731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.28771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.28814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.28875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.28905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.28975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.29131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.29138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.29160: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.29187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.29254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.29300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.29418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.29484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.29507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.29756: variable 'network_connections' from source: include params 28983 1726883055.29777: variable 'interface' from source: play vars 28983 1726883055.29860: variable 'interface' from source: play vars 28983 1726883055.30052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883055.30554: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883055.30557: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883055.30647: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883055.30781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883055.30902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883055.31040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883055.31051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.31127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883055.31274: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883055.32061: variable 'network_connections' from source: include params 28983 1726883055.32076: variable 'interface' from source: play vars 28983 1726883055.32164: variable 'interface' from source: play vars 28983 1726883055.32309: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883055.32524: when evaluation is False, skipping this task 28983 1726883055.32527: _execute() done 28983 1726883055.32529: dumping result to json 28983 1726883055.32531: done dumping result, returning 28983 1726883055.32533: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001464] 28983 1726883055.32537: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001464 28983 1726883055.32950: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001464 28983 1726883055.32953: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883055.33029: no more pending results, returning what we have 28983 1726883055.33035: results queue empty 28983 1726883055.33040: checking for any_errors_fatal 28983 1726883055.33050: done checking for any_errors_fatal 28983 1726883055.33052: checking for max_fail_percentage 28983 1726883055.33054: done checking for max_fail_percentage 28983 1726883055.33055: checking to see if all hosts have failed and the running result is not ok 28983 1726883055.33056: done checking to see if all hosts have failed 28983 1726883055.33057: getting the remaining hosts for this loop 28983 1726883055.33060: done getting the remaining hosts for this loop 28983 1726883055.33070: getting the next task for host managed_node2 28983 1726883055.33084: done getting next task for host managed_node2 28983 1726883055.33089: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883055.33097: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883055.33131: getting variables 28983 1726883055.33536: in VariableManager get_vars() 28983 1726883055.33592: Calling all_inventory to load vars for managed_node2 28983 1726883055.33598: Calling groups_inventory to load vars for managed_node2 28983 1726883055.33601: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883055.33616: Calling all_plugins_play to load vars for managed_node2 28983 1726883055.33620: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883055.33625: Calling groups_plugins_play to load vars for managed_node2 28983 1726883055.39132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883055.43101: done with get_vars() 28983 1726883055.43229: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883055.43344: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:15 -0400 (0:00:00.224) 0:01:25.431 ****** 28983 1726883055.43387: entering _queue_task() for managed_node2/yum 28983 1726883055.43947: worker is 1 (out of 1 available) 28983 1726883055.43963: exiting _queue_task() for managed_node2/yum 28983 1726883055.43978: done queuing things up, now waiting for results queue to drain 28983 1726883055.43980: waiting for pending results... 28983 1726883055.44569: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883055.45043: in run() - task 0affe814-3a2d-b16d-c0a7-000000001465 28983 1726883055.45138: variable 'ansible_search_path' from source: unknown 28983 1726883055.45142: variable 'ansible_search_path' from source: unknown 28983 1726883055.45178: calling self._execute() 28983 1726883055.45371: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883055.45455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883055.45460: variable 'omit' from source: magic vars 28983 1726883055.46350: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.46353: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883055.46822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883055.50017: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883055.50258: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883055.50543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883055.50547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883055.50572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883055.50845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.50859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.50906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.51067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.51093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.51432: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.51440: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883055.51450: when evaluation is False, skipping this task 28983 1726883055.51541: _execute() done 28983 1726883055.51547: dumping result to json 28983 1726883055.51550: done dumping result, returning 28983 1726883055.51554: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001465] 28983 1726883055.51557: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001465 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883055.51954: no more pending results, returning what we have 28983 1726883055.51958: results queue empty 28983 1726883055.51959: checking for any_errors_fatal 28983 1726883055.51970: done checking for any_errors_fatal 28983 1726883055.51971: checking for max_fail_percentage 28983 1726883055.51973: done checking for max_fail_percentage 28983 1726883055.51974: checking to see if all hosts have failed and the running result is not ok 28983 1726883055.51976: done checking to see if all hosts have failed 28983 1726883055.51976: getting the remaining hosts for this loop 28983 1726883055.51979: done getting the remaining hosts for this loop 28983 1726883055.51984: getting the next task for host managed_node2 28983 1726883055.52003: done getting next task for host managed_node2 28983 1726883055.52009: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883055.52016: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883055.52049: getting variables 28983 1726883055.52052: in VariableManager get_vars() 28983 1726883055.52102: Calling all_inventory to load vars for managed_node2 28983 1726883055.52356: Calling groups_inventory to load vars for managed_node2 28983 1726883055.52361: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883055.52408: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001465 28983 1726883055.52412: WORKER PROCESS EXITING 28983 1726883055.52423: Calling all_plugins_play to load vars for managed_node2 28983 1726883055.52428: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883055.52433: Calling groups_plugins_play to load vars for managed_node2 28983 1726883055.55659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883055.59143: done with get_vars() 28983 1726883055.59187: done getting variables 28983 1726883055.59266: Loading ActionModule 'fail' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:15 -0400 (0:00:00.159) 0:01:25.590 ****** 28983 1726883055.59322: entering _queue_task() for managed_node2/fail 28983 1726883055.59943: worker is 1 (out of 1 available) 28983 1726883055.59955: exiting _queue_task() for managed_node2/fail 28983 1726883055.59967: done queuing things up, now waiting for results queue to drain 28983 1726883055.59969: waiting for pending results... 28983 1726883055.60111: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883055.60296: in run() - task 0affe814-3a2d-b16d-c0a7-000000001466 28983 1726883055.60325: variable 'ansible_search_path' from source: unknown 28983 1726883055.60330: variable 'ansible_search_path' from source: unknown 28983 1726883055.60372: calling self._execute() 28983 1726883055.60501: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883055.60508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883055.60521: variable 'omit' from source: magic vars 28983 1726883055.61029: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.61044: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883055.61224: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883055.61640: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883055.64500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883055.64596: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883055.64639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883055.64696: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883055.64729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883055.64836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.64889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.64936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.64989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.65013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.65080: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.65110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.65158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.65211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.65240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.65441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.65445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.65448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.65452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883055.65459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.65726: variable 'network_connections' from source: include params 28983 1726883055.65741: variable 'interface' from source: play vars 28983 1726883055.65839: variable 'interface' from source: play vars 28983 1726883055.65945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883055.66186: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883055.66245: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883055.66283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883055.66323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883055.66381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883055.66410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883055.66457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.66492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883055.66560: variable 
'__network_team_connections_defined' from source: role '' defaults 28983 1726883055.66930: variable 'network_connections' from source: include params 28983 1726883055.66938: variable 'interface' from source: play vars 28983 1726883055.67028: variable 'interface' from source: play vars 28983 1726883055.67057: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883055.67062: when evaluation is False, skipping this task 28983 1726883055.67064: _execute() done 28983 1726883055.67067: dumping result to json 28983 1726883055.67240: done dumping result, returning 28983 1726883055.67243: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001466] 28983 1726883055.67245: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001466 28983 1726883055.67325: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001466 28983 1726883055.67329: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883055.67394: no more pending results, returning what we have 28983 1726883055.67397: results queue empty 28983 1726883055.67399: checking for any_errors_fatal 28983 1726883055.67520: done checking for any_errors_fatal 28983 1726883055.67522: checking for max_fail_percentage 28983 1726883055.67526: done checking for max_fail_percentage 28983 1726883055.67527: checking to see if all hosts have failed and the running result is not ok 28983 1726883055.67528: done checking to see if all hosts have failed 28983 1726883055.67529: getting the remaining hosts for this loop 28983 1726883055.67531: done getting the remaining hosts for this loop 28983 1726883055.67537: getting the next task for host 
managed_node2 28983 1726883055.67546: done getting next task for host managed_node2 28983 1726883055.67550: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883055.67557: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883055.67583: getting variables 28983 1726883055.67585: in VariableManager get_vars() 28983 1726883055.67632: Calling all_inventory to load vars for managed_node2 28983 1726883055.67755: Calling groups_inventory to load vars for managed_node2 28983 1726883055.67759: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883055.67769: Calling all_plugins_play to load vars for managed_node2 28983 1726883055.67776: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883055.67780: Calling groups_plugins_play to load vars for managed_node2 28983 1726883055.70227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883055.73491: done with get_vars() 28983 1726883055.73529: done getting variables 28983 1726883055.73609: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:15 -0400 (0:00:00.143) 0:01:25.734 ****** 28983 1726883055.73663: entering _queue_task() for managed_node2/package 28983 1726883055.74056: worker is 1 (out of 1 available) 28983 1726883055.74070: exiting _queue_task() for managed_node2/package 28983 1726883055.74203: done queuing things up, now waiting for results queue to drain 28983 1726883055.74206: waiting for pending results... 
28983 1726883055.74455: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883055.74643: in run() - task 0affe814-3a2d-b16d-c0a7-000000001467 28983 1726883055.74668: variable 'ansible_search_path' from source: unknown 28983 1726883055.74672: variable 'ansible_search_path' from source: unknown 28983 1726883055.74715: calling self._execute() 28983 1726883055.74845: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883055.74941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883055.74945: variable 'omit' from source: magic vars 28983 1726883055.75367: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.75383: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883055.75685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883055.76042: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883055.76110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883055.76157: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883055.76250: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883055.76414: variable 'network_packages' from source: role '' defaults 28983 1726883055.76563: variable '__network_provider_setup' from source: role '' defaults 28983 1726883055.76640: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883055.76668: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883055.76681: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883055.76767: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726883055.77073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883055.80903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883055.81113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883055.81117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883055.81120: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883055.81136: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883055.81236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.81286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.81319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.81386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.81405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883055.81467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.81507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.81538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.81607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.81627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.81971: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883055.82137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.82298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.82301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.82304: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.82307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.82408: variable 'ansible_python' from source: facts 28983 1726883055.82427: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883055.82540: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883055.82654: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883055.82851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.82888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.82927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.82984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.83009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.83080: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883055.83114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883055.83155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.83219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883055.83247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883055.83458: variable 'network_connections' from source: include params 28983 1726883055.83539: variable 'interface' from source: play vars 28983 1726883055.83591: variable 'interface' from source: play vars 28983 1726883055.83686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883055.83718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883055.83759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883055.83814: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883055.83871: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883055.84284: variable 'network_connections' from source: include params 28983 1726883055.84290: variable 'interface' from source: play vars 28983 1726883055.84530: variable 'interface' from source: play vars 28983 1726883055.84622: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883055.84690: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883055.85496: variable 'network_connections' from source: include params 28983 1726883055.85500: variable 'interface' from source: play vars 28983 1726883055.85502: variable 'interface' from source: play vars 28983 1726883055.85505: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883055.85507: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883055.85950: variable 'network_connections' from source: include params 28983 1726883055.85955: variable 'interface' from source: play vars 28983 1726883055.85987: variable 'interface' from source: play vars 28983 1726883055.86066: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883055.86154: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883055.86162: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883055.86252: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883055.86591: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883055.87326: variable 'network_connections' from source: include params 28983 1726883055.87331: variable 'interface' from 
source: play vars 28983 1726883055.87419: variable 'interface' from source: play vars 28983 1726883055.87428: variable 'ansible_distribution' from source: facts 28983 1726883055.87432: variable '__network_rh_distros' from source: role '' defaults 28983 1726883055.87444: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.87461: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883055.87756: variable 'ansible_distribution' from source: facts 28983 1726883055.87759: variable '__network_rh_distros' from source: role '' defaults 28983 1726883055.87762: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.87764: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883055.88045: variable 'ansible_distribution' from source: facts 28983 1726883055.88049: variable '__network_rh_distros' from source: role '' defaults 28983 1726883055.88051: variable 'ansible_distribution_major_version' from source: facts 28983 1726883055.88068: variable 'network_provider' from source: set_fact 28983 1726883055.88097: variable 'ansible_facts' from source: unknown 28983 1726883055.89348: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883055.89358: when evaluation is False, skipping this task 28983 1726883055.89366: _execute() done 28983 1726883055.89377: dumping result to json 28983 1726883055.89386: done dumping result, returning 28983 1726883055.89399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000001467] 28983 1726883055.89409: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001467 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 
1726883055.89610: no more pending results, returning what we have 28983 1726883055.89614: results queue empty 28983 1726883055.89616: checking for any_errors_fatal 28983 1726883055.89626: done checking for any_errors_fatal 28983 1726883055.89628: checking for max_fail_percentage 28983 1726883055.89630: done checking for max_fail_percentage 28983 1726883055.89631: checking to see if all hosts have failed and the running result is not ok 28983 1726883055.89632: done checking to see if all hosts have failed 28983 1726883055.89633: getting the remaining hosts for this loop 28983 1726883055.89741: done getting the remaining hosts for this loop 28983 1726883055.89763: getting the next task for host managed_node2 28983 1726883055.89783: done getting next task for host managed_node2 28983 1726883055.89792: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883055.89805: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883055.89865: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001467 28983 1726883055.89869: WORKER PROCESS EXITING 28983 1726883055.89906: getting variables 28983 1726883055.89909: in VariableManager get_vars() 28983 1726883055.90148: Calling all_inventory to load vars for managed_node2 28983 1726883055.90153: Calling groups_inventory to load vars for managed_node2 28983 1726883055.90156: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883055.90167: Calling all_plugins_play to load vars for managed_node2 28983 1726883055.90171: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883055.90178: Calling groups_plugins_play to load vars for managed_node2 28983 1726883055.96091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883056.04883: done with get_vars() 28983 1726883056.05144: done getting variables 28983 1726883056.05338: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:16 -0400 (0:00:00.317) 0:01:26.051 ****** 28983 1726883056.05384: entering _queue_task() for managed_node2/package 28983 1726883056.07125: worker is 1 (out of 1 available) 28983 1726883056.07144: exiting _queue_task() for managed_node2/package 28983 1726883056.07158: done queuing things up, now waiting for results queue to drain 28983 
1726883056.07160: waiting for pending results... 28983 1726883056.08089: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883056.08320: in run() - task 0affe814-3a2d-b16d-c0a7-000000001468 28983 1726883056.08584: variable 'ansible_search_path' from source: unknown 28983 1726883056.08588: variable 'ansible_search_path' from source: unknown 28983 1726883056.08704: calling self._execute() 28983 1726883056.09247: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883056.09252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883056.09255: variable 'omit' from source: magic vars 28983 1726883056.10339: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.10359: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883056.10726: variable 'network_state' from source: role '' defaults 28983 1726883056.10747: Evaluated conditional (network_state != {}): False 28983 1726883056.10755: when evaluation is False, skipping this task 28983 1726883056.10763: _execute() done 28983 1726883056.10775: dumping result to json 28983 1726883056.10789: done dumping result, returning 28983 1726883056.10802: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001468] 28983 1726883056.10813: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001468 28983 1726883056.11320: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001468 28983 1726883056.11325: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883056.11386: no more pending results, returning what we have 28983 1726883056.11391: 
results queue empty 28983 1726883056.11392: checking for any_errors_fatal 28983 1726883056.11400: done checking for any_errors_fatal 28983 1726883056.11401: checking for max_fail_percentage 28983 1726883056.11403: done checking for max_fail_percentage 28983 1726883056.11404: checking to see if all hosts have failed and the running result is not ok 28983 1726883056.11405: done checking to see if all hosts have failed 28983 1726883056.11406: getting the remaining hosts for this loop 28983 1726883056.11409: done getting the remaining hosts for this loop 28983 1726883056.11414: getting the next task for host managed_node2 28983 1726883056.11429: done getting next task for host managed_node2 28983 1726883056.11437: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883056.11443: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883056.11473: getting variables 28983 1726883056.11475: in VariableManager get_vars() 28983 1726883056.11524: Calling all_inventory to load vars for managed_node2 28983 1726883056.11640: Calling groups_inventory to load vars for managed_node2 28983 1726883056.11645: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883056.11654: Calling all_plugins_play to load vars for managed_node2 28983 1726883056.11658: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883056.11661: Calling groups_plugins_play to load vars for managed_node2 28983 1726883056.15514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883056.21475: done with get_vars() 28983 1726883056.21511: done getting variables 28983 1726883056.21693: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:16 -0400 (0:00:00.163) 0:01:26.215 ****** 28983 1726883056.21737: entering _queue_task() for managed_node2/package 28983 1726883056.22398: worker is 1 (out of 1 available) 28983 1726883056.22413: exiting _queue_task() for managed_node2/package 28983 1726883056.22427: done queuing things up, now waiting for results queue to drain 28983 1726883056.22429: waiting for pending results... 
28983 1726883056.22787: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883056.22952: in run() - task 0affe814-3a2d-b16d-c0a7-000000001469 28983 1726883056.23141: variable 'ansible_search_path' from source: unknown 28983 1726883056.23145: variable 'ansible_search_path' from source: unknown 28983 1726883056.23148: calling self._execute() 28983 1726883056.23166: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883056.23170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883056.23177: variable 'omit' from source: magic vars 28983 1726883056.23676: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.23688: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883056.23857: variable 'network_state' from source: role '' defaults 28983 1726883056.23879: Evaluated conditional (network_state != {}): False 28983 1726883056.23883: when evaluation is False, skipping this task 28983 1726883056.23886: _execute() done 28983 1726883056.23889: dumping result to json 28983 1726883056.23894: done dumping result, returning 28983 1726883056.23902: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001469] 28983 1726883056.23910: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001469 28983 1726883056.24030: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001469 28983 1726883056.24035: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883056.24254: no more pending results, returning what we have 28983 1726883056.24258: results queue empty 28983 1726883056.24260: checking for 
any_errors_fatal 28983 1726883056.24266: done checking for any_errors_fatal 28983 1726883056.24268: checking for max_fail_percentage 28983 1726883056.24269: done checking for max_fail_percentage 28983 1726883056.24270: checking to see if all hosts have failed and the running result is not ok 28983 1726883056.24271: done checking to see if all hosts have failed 28983 1726883056.24272: getting the remaining hosts for this loop 28983 1726883056.24274: done getting the remaining hosts for this loop 28983 1726883056.24279: getting the next task for host managed_node2 28983 1726883056.24288: done getting next task for host managed_node2 28983 1726883056.24293: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883056.24299: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883056.24323: getting variables 28983 1726883056.24325: in VariableManager get_vars() 28983 1726883056.24370: Calling all_inventory to load vars for managed_node2 28983 1726883056.24374: Calling groups_inventory to load vars for managed_node2 28983 1726883056.24377: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883056.24388: Calling all_plugins_play to load vars for managed_node2 28983 1726883056.24392: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883056.24396: Calling groups_plugins_play to load vars for managed_node2 28983 1726883056.26814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883056.31289: done with get_vars() 28983 1726883056.31325: done getting variables 28983 1726883056.31403: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:16 -0400 (0:00:00.097) 0:01:26.312 ****** 28983 1726883056.31468: entering _queue_task() for managed_node2/service 28983 1726883056.31928: worker is 1 (out of 1 available) 28983 1726883056.31944: exiting _queue_task() for managed_node2/service 28983 1726883056.31957: done queuing things up, now waiting for results queue to drain 28983 1726883056.31959: waiting for pending results... 
28983 1726883056.32385: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883056.32390: in run() - task 0affe814-3a2d-b16d-c0a7-00000000146a 28983 1726883056.32400: variable 'ansible_search_path' from source: unknown 28983 1726883056.32410: variable 'ansible_search_path' from source: unknown 28983 1726883056.32462: calling self._execute() 28983 1726883056.32739: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883056.32744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883056.32747: variable 'omit' from source: magic vars 28983 1726883056.33083: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.33099: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883056.33687: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883056.34791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883056.39540: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883056.39633: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883056.39682: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883056.39723: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883056.39760: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883056.39864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883056.39904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.39936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.39996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.40014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.40135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.40140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.40144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.40197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.40223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.40287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.40320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.40352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.40408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.40460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.40680: variable 'network_connections' from source: include params 28983 1726883056.40693: variable 'interface' from source: play vars 28983 1726883056.40949: variable 'interface' from source: play vars 28983 1726883056.41180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883056.41564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883056.41730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883056.41894: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883056.41898: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883056.41944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883056.41966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883056.41997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.42028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883056.42426: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883056.43180: variable 'network_connections' from source: include params 28983 1726883056.43188: variable 'interface' from source: play vars 28983 1726883056.43414: variable 'interface' from source: play vars 28983 1726883056.43418: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883056.43421: when evaluation is False, skipping this task 28983 1726883056.43423: _execute() done 28983 1726883056.43425: dumping result to json 28983 1726883056.43427: done dumping result, returning 28983 1726883056.43429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000146a] 28983 1726883056.43431: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146a skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883056.43938: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146a 28983 1726883056.43977: WORKER PROCESS EXITING 28983 1726883056.43957: no more pending results, returning what we have 28983 1726883056.44053: results queue empty 28983 1726883056.44055: checking for any_errors_fatal 28983 1726883056.44062: done checking for any_errors_fatal 28983 1726883056.44063: checking for max_fail_percentage 28983 1726883056.44065: done checking for max_fail_percentage 28983 1726883056.44066: checking to see if all hosts have failed and the running result is not ok 28983 1726883056.44067: done checking to see if all hosts have failed 28983 1726883056.44068: getting the remaining hosts for this loop 28983 1726883056.44070: done getting the remaining hosts for this loop 28983 1726883056.44075: getting the next task for host managed_node2 28983 1726883056.44085: done getting next task for host managed_node2 28983 1726883056.44090: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883056.44096: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883056.44120: getting variables 28983 1726883056.44122: in VariableManager get_vars() 28983 1726883056.44251: Calling all_inventory to load vars for managed_node2 28983 1726883056.44254: Calling groups_inventory to load vars for managed_node2 28983 1726883056.44257: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883056.44271: Calling all_plugins_play to load vars for managed_node2 28983 1726883056.44274: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883056.44278: Calling groups_plugins_play to load vars for managed_node2 28983 1726883056.49092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883056.54793: done with get_vars() 28983 1726883056.54853: done getting variables 28983 1726883056.54923: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:16 -0400 (0:00:00.235) 0:01:26.547 ****** 28983 1726883056.54970: entering _queue_task() for managed_node2/service 28983 1726883056.55416: worker is 1 (out of 1 available) 28983 1726883056.55431: exiting _queue_task() for managed_node2/service 28983 1726883056.55596: done queuing things up, now 
waiting for results queue to drain 28983 1726883056.55598: waiting for pending results... 28983 1726883056.55781: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883056.56114: in run() - task 0affe814-3a2d-b16d-c0a7-00000000146b 28983 1726883056.56352: variable 'ansible_search_path' from source: unknown 28983 1726883056.56359: variable 'ansible_search_path' from source: unknown 28983 1726883056.56399: calling self._execute() 28983 1726883056.56642: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883056.56646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883056.56649: variable 'omit' from source: magic vars 28983 1726883056.57406: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.57460: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883056.57920: variable 'network_provider' from source: set_fact 28983 1726883056.58017: variable 'network_state' from source: role '' defaults 28983 1726883056.58038: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883056.58072: variable 'omit' from source: magic vars 28983 1726883056.58266: variable 'omit' from source: magic vars 28983 1726883056.58398: variable 'network_service_name' from source: role '' defaults 28983 1726883056.58537: variable 'network_service_name' from source: role '' defaults 28983 1726883056.58703: variable '__network_provider_setup' from source: role '' defaults 28983 1726883056.58722: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883056.58822: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883056.58843: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883056.58944: variable '__network_packages_default_nm' from source: role '' defaults 28983 
1726883056.59324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883056.62459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883056.62740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883056.62744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883056.62746: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883056.62748: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883056.62857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.62910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.62965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.63046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.63112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.63217: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.63265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.63319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.63398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.63523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.63922: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883056.64124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.64179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.64222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.64339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.64342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.64466: variable 'ansible_python' from source: facts 28983 1726883056.64500: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883056.64633: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883056.64762: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883056.64966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.65051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.65054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.65127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.65156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.65233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883056.65337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883056.65343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.65395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883056.65418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883056.65628: variable 'network_connections' from source: include params 28983 1726883056.65644: variable 'interface' from source: play vars 28983 1726883056.65760: variable 'interface' from source: play vars 28983 1726883056.65943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883056.66215: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883056.66340: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883056.66350: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883056.66407: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883056.66509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883056.66556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883056.66611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883056.66661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883056.66730: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883056.67218: variable 'network_connections' from source: include params 28983 1726883056.67221: variable 'interface' from source: play vars 28983 1726883056.67263: variable 'interface' from source: play vars 28983 1726883056.67307: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883056.67418: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883056.67847: variable 'network_connections' from source: include params 28983 1726883056.67859: variable 'interface' from source: play vars 28983 1726883056.67959: variable 'interface' from source: play vars 28983 1726883056.67992: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883056.68107: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883056.68552: variable 'network_connections' from source: include params 28983 1726883056.68556: variable 'interface' from source: play vars 28983 1726883056.68629: variable 'interface' from source: play vars 28983 1726883056.68714: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883056.68807: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883056.68840: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883056.68911: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883056.69288: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883056.70243: variable 'network_connections' from source: include params 28983 1726883056.70247: variable 'interface' from source: play vars 28983 1726883056.70298: variable 'interface' from source: play vars 28983 1726883056.70312: variable 'ansible_distribution' from source: facts 28983 1726883056.70321: variable '__network_rh_distros' from source: role '' defaults 28983 1726883056.70333: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.70368: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883056.70665: variable 'ansible_distribution' from source: facts 28983 1726883056.70693: variable '__network_rh_distros' from source: role '' defaults 28983 1726883056.70709: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.70827: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883056.71010: variable 'ansible_distribution' from source: facts 28983 1726883056.71024: variable '__network_rh_distros' from source: role '' defaults 28983 1726883056.71039: variable 'ansible_distribution_major_version' from source: facts 28983 1726883056.71098: variable 'network_provider' from source: set_fact 28983 1726883056.71133: variable 'omit' from source: magic vars 28983 1726883056.71186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883056.71232: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883056.71264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883056.71306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883056.71327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883056.71395: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883056.71441: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883056.71447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883056.71577: Set connection var ansible_connection to ssh 28983 1726883056.71600: Set connection var ansible_shell_executable to /bin/sh 28983 1726883056.71621: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883056.71638: Set connection var ansible_timeout to 10 28983 1726883056.71709: Set connection var ansible_pipelining to False 28983 1726883056.71712: Set connection var ansible_shell_type to sh 28983 1726883056.71717: variable 'ansible_shell_executable' from source: unknown 28983 1726883056.71719: variable 'ansible_connection' from source: unknown 28983 1726883056.71721: variable 'ansible_module_compression' from source: unknown 28983 1726883056.71723: variable 'ansible_shell_type' from source: unknown 28983 1726883056.71725: variable 'ansible_shell_executable' from source: unknown 28983 1726883056.71727: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883056.71739: variable 'ansible_pipelining' from source: unknown 28983 1726883056.71844: variable 'ansible_timeout' from source: unknown 28983 1726883056.71847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883056.71940: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883056.71975: variable 'omit' from source: magic vars 28983 1726883056.71993: starting attempt loop 28983 1726883056.72003: running the handler 28983 1726883056.72122: variable 'ansible_facts' from source: unknown 28983 1726883056.73682: _low_level_execute_command(): starting 28983 1726883056.73804: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883056.74594: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883056.74650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883056.74725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883056.74746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883056.74775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726883056.74870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883056.76702: stdout chunk (state=3): >>>/root <<< 28983 1726883056.76901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883056.76918: stderr chunk (state=3): >>><<< 28983 1726883056.76927: stdout chunk (state=3): >>><<< 28983 1726883056.76960: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883056.77008: _low_level_execute_command(): starting 28983 1726883056.77103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481 `" && echo ansible-tmp-1726883056.769682-32203-97487528695481="` echo 
/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481 `" ) && sleep 0' 28983 1726883056.77852: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883056.77855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883056.77857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883056.78099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883056.78182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883056.80285: stdout chunk (state=3): >>>ansible-tmp-1726883056.769682-32203-97487528695481=/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481 <<< 28983 1726883056.80666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883056.80670: stdout chunk (state=3): >>><<< 28983 1726883056.80674: stderr chunk (state=3): >>><<< 28983 1726883056.80678: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726883056.769682-32203-97487528695481=/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883056.80680: variable 'ansible_module_compression' from source: unknown 28983 1726883056.80682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883056.80684: variable 'ansible_facts' from source: unknown 28983 1726883056.80907: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py 28983 1726883056.81278: Sending initial data 28983 1726883056.81398: Sent initial data (154 bytes) 28983 1726883056.82351: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883056.82354: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 28983 1726883056.82358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883056.82489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883056.82498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883056.82501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883056.82978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883056.83041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883056.84848: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883056.84853: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883056.84948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883056.85015: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py" <<< 28983 1726883056.85027: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpkfvdavqf /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py <<< 28983 1726883056.85084: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpkfvdavqf" to remote "/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py" <<< 28983 1726883056.89643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883056.89646: stderr chunk (state=3): >>><<< 28983 1726883056.89649: stdout chunk (state=3): >>><<< 28983 1726883056.89659: done transferring module to remote 28983 1726883056.89674: _low_level_execute_command(): starting 28983 1726883056.89677: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/ /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py && sleep 0' 28983 1726883056.90302: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883056.90313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883056.90324: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883056.90343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883056.90362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883056.90537: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883056.90541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883056.90544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883056.90546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883056.90548: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883056.90550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883056.90553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883056.90555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883056.90558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883056.90560: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883056.90562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883056.90564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883056.90566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883056.90711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883056.90808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883056.92688: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 28983 1726883056.92770: stderr chunk (state=3): >>><<< 28983 1726883056.92777: stdout chunk (state=3): >>><<< 28983 1726883056.92894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883056.92899: _low_level_execute_command(): starting 28983 1726883056.92902: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/AnsiballZ_systemd.py && sleep 0' 28983 1726883056.93467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883056.93549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883056.93568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883056.93613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883056.93662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883056.93747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883057.26687: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": 
"success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4526080", "MemoryAvailable": "infinity", "CPUUsageNSec": "1592579000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": 
"no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883057.28694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883057.28789: stderr chunk (state=3): >>><<< 28983 1726883057.28887: stdout chunk (state=3): >>><<< 28983 1726883057.28891: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4526080", "MemoryAvailable": "infinity", "CPUUsageNSec": "1592579000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883057.29662: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883057.29697: _low_level_execute_command(): starting 28983 1726883057.29708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883056.769682-32203-97487528695481/ > /dev/null 2>&1 && sleep 0' 28983 1726883057.30974: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883057.31167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883057.31391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883057.31762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883057.33853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883057.33856: stdout chunk (state=3): >>><<< 28983 1726883057.33859: stderr chunk (state=3): >>><<< 28983 1726883057.34201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883057.34204: handler run complete 28983 1726883057.34207: attempt loop complete, returning result 28983 1726883057.34216: _execute() done 28983 1726883057.34257: dumping 
result to json 28983 1726883057.34634: done dumping result, returning 28983 1726883057.34639: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-00000000146b] 28983 1726883057.34642: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146b ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883057.35614: no more pending results, returning what we have 28983 1726883057.35618: results queue empty 28983 1726883057.35619: checking for any_errors_fatal 28983 1726883057.35630: done checking for any_errors_fatal 28983 1726883057.35631: checking for max_fail_percentage 28983 1726883057.35635: done checking for max_fail_percentage 28983 1726883057.35636: checking to see if all hosts have failed and the running result is not ok 28983 1726883057.35637: done checking to see if all hosts have failed 28983 1726883057.35638: getting the remaining hosts for this loop 28983 1726883057.35641: done getting the remaining hosts for this loop 28983 1726883057.35647: getting the next task for host managed_node2 28983 1726883057.35664: done getting next task for host managed_node2 28983 1726883057.35669: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883057.35676: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883057.35693: getting variables 28983 1726883057.35695: in VariableManager get_vars() 28983 1726883057.36183: Calling all_inventory to load vars for managed_node2 28983 1726883057.36187: Calling groups_inventory to load vars for managed_node2 28983 1726883057.36190: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883057.36201: Calling all_plugins_play to load vars for managed_node2 28983 1726883057.36205: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883057.36209: Calling groups_plugins_play to load vars for managed_node2 28983 1726883057.36864: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146b 28983 1726883057.36867: WORKER PROCESS EXITING 28983 1726883057.43383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883057.61630: done with get_vars() 28983 1726883057.61672: done getting variables 28983 1726883057.61747: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:17 -0400 (0:00:01.068) 0:01:27.615 ****** 28983 1726883057.61788: entering _queue_task() for managed_node2/service 28983 1726883057.62581: worker is 1 (out of 1 available) 28983 1726883057.62817: exiting _queue_task() for managed_node2/service 28983 1726883057.62830: done queuing things up, now waiting for results queue to drain 28983 1726883057.62833: waiting for pending results... 28983 1726883057.63303: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883057.63727: in run() - task 0affe814-3a2d-b16d-c0a7-00000000146c 28983 1726883057.63817: variable 'ansible_search_path' from source: unknown 28983 1726883057.63829: variable 'ansible_search_path' from source: unknown 28983 1726883057.63990: calling self._execute() 28983 1726883057.64191: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883057.64206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883057.64226: variable 'omit' from source: magic vars 28983 1726883057.64761: variable 'ansible_distribution_major_version' from source: facts 28983 1726883057.64785: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883057.64998: variable 'network_provider' from source: set_fact 28983 1726883057.65003: Evaluated conditional (network_provider == "nm"): True 28983 1726883057.65120: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883057.65247: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883057.65510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883057.69592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 
1726883057.69686: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883057.69753: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883057.69817: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883057.69917: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883057.70001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883057.70085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883057.70124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883057.70196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883057.70241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883057.70358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883057.70417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883057.70489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883057.70613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883057.70632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883057.70730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883057.70770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883057.70840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883057.70903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883057.70929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883057.71270: variable 'network_connections' from source: include params 
28983 1726883057.71391: variable 'interface' from source: play vars 28983 1726883057.71535: variable 'interface' from source: play vars 28983 1726883057.71944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883057.72158: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883057.72338: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883057.72605: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883057.72608: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883057.72740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883057.72779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883057.72817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883057.72895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883057.73083: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883057.73731: variable 'network_connections' from source: include params 28983 1726883057.73746: variable 'interface' from source: play vars 28983 1726883057.73844: variable 'interface' from source: play vars 28983 1726883057.73887: Evaluated conditional 
(__network_wpa_supplicant_required): False 28983 1726883057.73896: when evaluation is False, skipping this task 28983 1726883057.73904: _execute() done 28983 1726883057.73941: dumping result to json 28983 1726883057.73944: done dumping result, returning 28983 1726883057.73947: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-00000000146c] 28983 1726883057.74027: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146c 28983 1726883057.74115: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146c 28983 1726883057.74118: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883057.74195: no more pending results, returning what we have 28983 1726883057.74199: results queue empty 28983 1726883057.74201: checking for any_errors_fatal 28983 1726883057.74235: done checking for any_errors_fatal 28983 1726883057.74237: checking for max_fail_percentage 28983 1726883057.74239: done checking for max_fail_percentage 28983 1726883057.74240: checking to see if all hosts have failed and the running result is not ok 28983 1726883057.74241: done checking to see if all hosts have failed 28983 1726883057.74242: getting the remaining hosts for this loop 28983 1726883057.74244: done getting the remaining hosts for this loop 28983 1726883057.74250: getting the next task for host managed_node2 28983 1726883057.74261: done getting next task for host managed_node2 28983 1726883057.74266: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883057.74275: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883057.74432: getting variables 28983 1726883057.74436: in VariableManager get_vars() 28983 1726883057.74499: Calling all_inventory to load vars for managed_node2 28983 1726883057.74503: Calling groups_inventory to load vars for managed_node2 28983 1726883057.74508: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883057.74644: Calling all_plugins_play to load vars for managed_node2 28983 1726883057.74653: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883057.74658: Calling groups_plugins_play to load vars for managed_node2 28983 1726883057.78439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883057.83260: done with get_vars() 28983 1726883057.83382: done getting variables 28983 1726883057.83666: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:17 -0400 (0:00:00.220) 0:01:27.836 ****** 28983 1726883057.83836: entering _queue_task() for managed_node2/service 28983 1726883057.85000: worker is 1 (out of 1 available) 28983 1726883057.85014: exiting _queue_task() for managed_node2/service 28983 1726883057.85027: done queuing things up, now waiting for results queue to drain 28983 1726883057.85029: waiting for pending results... 28983 1726883057.85999: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883057.86212: in run() - task 0affe814-3a2d-b16d-c0a7-00000000146d 28983 1726883057.86230: variable 'ansible_search_path' from source: unknown 28983 1726883057.86236: variable 'ansible_search_path' from source: unknown 28983 1726883057.86585: calling self._execute() 28983 1726883057.86624: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883057.86628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883057.86631: variable 'omit' from source: magic vars 28983 1726883057.87894: variable 'ansible_distribution_major_version' from source: facts 28983 1726883057.87908: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883057.88318: variable 'network_provider' from source: set_fact 28983 1726883057.88331: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883057.88343: when evaluation is False, skipping this task 28983 1726883057.88350: _execute() done 28983 1726883057.88358: dumping result to json 28983 1726883057.88365: done dumping result, 
returning 28983 1726883057.88382: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-00000000146d] 28983 1726883057.88400: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146d 28983 1726883057.88574: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146d 28983 1726883057.88577: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883057.88638: no more pending results, returning what we have 28983 1726883057.88642: results queue empty 28983 1726883057.88643: checking for any_errors_fatal 28983 1726883057.88655: done checking for any_errors_fatal 28983 1726883057.88656: checking for max_fail_percentage 28983 1726883057.88658: done checking for max_fail_percentage 28983 1726883057.88660: checking to see if all hosts have failed and the running result is not ok 28983 1726883057.88661: done checking to see if all hosts have failed 28983 1726883057.88662: getting the remaining hosts for this loop 28983 1726883057.88664: done getting the remaining hosts for this loop 28983 1726883057.88669: getting the next task for host managed_node2 28983 1726883057.88679: done getting next task for host managed_node2 28983 1726883057.88684: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883057.88691: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883057.88840: getting variables 28983 1726883057.88843: in VariableManager get_vars() 28983 1726883057.89026: Calling all_inventory to load vars for managed_node2 28983 1726883057.89029: Calling groups_inventory to load vars for managed_node2 28983 1726883057.89032: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883057.89079: Calling all_plugins_play to load vars for managed_node2 28983 1726883057.89085: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883057.89090: Calling groups_plugins_play to load vars for managed_node2 28983 1726883057.92127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883057.96012: done with get_vars() 28983 1726883057.96061: done getting variables 28983 1726883057.96132: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is 
present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:17 -0400 (0:00:00.123) 0:01:27.959 ****** 28983 1726883057.96187: entering _queue_task() for managed_node2/copy 28983 1726883057.96716: worker is 1 (out of 1 available) 28983 1726883057.96729: exiting _queue_task() for managed_node2/copy 28983 1726883057.96745: done queuing things up, now waiting for results queue to drain 28983 1726883057.96747: waiting for pending results... 28983 1726883057.97156: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883057.97210: in run() - task 0affe814-3a2d-b16d-c0a7-00000000146e 28983 1726883057.97233: variable 'ansible_search_path' from source: unknown 28983 1726883057.97251: variable 'ansible_search_path' from source: unknown 28983 1726883057.97308: calling self._execute() 28983 1726883057.97444: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883057.97459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883057.97484: variable 'omit' from source: magic vars 28983 1726883057.98214: variable 'ansible_distribution_major_version' from source: facts 28983 1726883057.98238: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883057.98600: variable 'network_provider' from source: set_fact 28983 1726883057.98697: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883057.98701: when evaluation is False, skipping this task 28983 1726883057.98708: _execute() done 28983 1726883057.98711: dumping result to json 28983 1726883057.98714: done dumping result, returning 28983 1726883057.98718: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-00000000146e] 28983 
1726883057.98720: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146e skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883057.99197: no more pending results, returning what we have 28983 1726883057.99202: results queue empty 28983 1726883057.99203: checking for any_errors_fatal 28983 1726883057.99212: done checking for any_errors_fatal 28983 1726883057.99213: checking for max_fail_percentage 28983 1726883057.99215: done checking for max_fail_percentage 28983 1726883057.99217: checking to see if all hosts have failed and the running result is not ok 28983 1726883057.99219: done checking to see if all hosts have failed 28983 1726883057.99220: getting the remaining hosts for this loop 28983 1726883057.99222: done getting the remaining hosts for this loop 28983 1726883057.99228: getting the next task for host managed_node2 28983 1726883057.99240: done getting next task for host managed_node2 28983 1726883057.99467: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883057.99474: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883057.99500: getting variables 28983 1726883057.99502: in VariableManager get_vars() 28983 1726883057.99656: Calling all_inventory to load vars for managed_node2 28983 1726883057.99660: Calling groups_inventory to load vars for managed_node2 28983 1726883057.99663: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883057.99670: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146e 28983 1726883057.99673: WORKER PROCESS EXITING 28983 1726883057.99682: Calling all_plugins_play to load vars for managed_node2 28983 1726883057.99686: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883057.99691: Calling groups_plugins_play to load vars for managed_node2 28983 1726883058.04663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883058.12107: done with get_vars() 28983 1726883058.12153: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:18 -0400 (0:00:00.161) 0:01:28.121 ****** 28983 1726883058.12467: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883058.13214: worker is 1 (out of 1 available) 28983 1726883058.13378: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883058.13391: done queuing things up, now waiting for results queue to drain 28983 1726883058.13394: waiting for pending results... 
28983 1726883058.13892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883058.14342: in run() - task 0affe814-3a2d-b16d-c0a7-00000000146f 28983 1726883058.14646: variable 'ansible_search_path' from source: unknown 28983 1726883058.14650: variable 'ansible_search_path' from source: unknown 28983 1726883058.14653: calling self._execute() 28983 1726883058.14656: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883058.14849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883058.14883: variable 'omit' from source: magic vars 28983 1726883058.15880: variable 'ansible_distribution_major_version' from source: facts 28983 1726883058.15976: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883058.15988: variable 'omit' from source: magic vars 28983 1726883058.16199: variable 'omit' from source: magic vars 28983 1726883058.16655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883058.23269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883058.23483: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883058.23536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883058.23740: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883058.23744: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883058.23908: variable 'network_provider' from source: set_fact 28983 1726883058.24264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883058.24365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883058.24407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883058.24600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883058.24626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883058.24796: variable 'omit' from source: magic vars 28983 1726883058.25101: variable 'omit' from source: magic vars 28983 1726883058.25382: variable 'network_connections' from source: include params 28983 1726883058.25461: variable 'interface' from source: play vars 28983 1726883058.25625: variable 'interface' from source: play vars 28983 1726883058.26039: variable 'omit' from source: magic vars 28983 1726883058.26050: variable '__lsr_ansible_managed' from source: task vars 28983 1726883058.26133: variable '__lsr_ansible_managed' from source: task vars 28983 1726883058.26365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883058.26676: Loaded config def from plugin (lookup/template) 28983 1726883058.26679: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883058.26712: File lookup term: get_ansible_managed.j2 28983 1726883058.26720: variable 
'ansible_search_path' from source: unknown 28983 1726883058.26726: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883058.26745: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883058.26853: variable 'ansible_search_path' from source: unknown 28983 1726883058.41150: variable 'ansible_managed' from source: unknown 28983 1726883058.41794: variable 'omit' from source: magic vars 28983 1726883058.41823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883058.41857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883058.41879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883058.41899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883058.41911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883058.42052: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883058.42055: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883058.42067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883058.42179: Set connection var ansible_connection to ssh 28983 1726883058.42193: Set connection var ansible_shell_executable to /bin/sh 28983 1726883058.42204: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883058.42215: Set connection var ansible_timeout to 10 28983 1726883058.42223: Set connection var ansible_pipelining to False 28983 1726883058.42226: Set connection var ansible_shell_type to sh 28983 1726883058.42448: variable 'ansible_shell_executable' from source: unknown 28983 1726883058.42452: variable 'ansible_connection' from source: unknown 28983 1726883058.42455: variable 'ansible_module_compression' from source: unknown 28983 1726883058.42494: variable 'ansible_shell_type' from source: unknown 28983 1726883058.42497: variable 'ansible_shell_executable' from source: unknown 28983 1726883058.42502: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883058.42504: variable 'ansible_pipelining' from source: unknown 28983 1726883058.42506: variable 'ansible_timeout' from source: unknown 28983 1726883058.42509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883058.42807: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883058.42818: variable 'omit' from 
source: magic vars 28983 1726883058.42821: starting attempt loop 28983 1726883058.42823: running the handler 28983 1726883058.42825: _low_level_execute_command(): starting 28983 1726883058.42827: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883058.43610: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.43614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883058.43616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883058.43623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883058.43705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883058.45700: stdout chunk (state=3): >>>/root <<< 28983 1726883058.45704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883058.45707: stderr chunk (state=3): >>><<< 28983 1726883058.45709: stdout chunk (state=3): >>><<< 28983 1726883058.45712: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883058.45715: _low_level_execute_command(): starting 28983 1726883058.45718: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042 `" && echo ansible-tmp-1726883058.4569705-32261-79570755398042="` echo /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042 `" ) && sleep 0' 28983 1726883058.46691: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883058.46695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883058.46713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883058.46729: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883058.46743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883058.46752: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883058.46762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.46777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883058.46791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883058.46794: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883058.46802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883058.46813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883058.46826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883058.46837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883058.46942: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883058.46946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.46949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883058.47145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883058.49086: stdout chunk (state=3): >>>ansible-tmp-1726883058.4569705-32261-79570755398042=/root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042 <<< 28983 1726883058.49199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883058.49524: stderr chunk 
(state=3): >>><<< 28983 1726883058.49528: stdout chunk (state=3): >>><<< 28983 1726883058.49531: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883058.4569705-32261-79570755398042=/root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883058.49867: variable 'ansible_module_compression' from source: unknown 28983 1726883058.49870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883058.49876: variable 'ansible_facts' from source: unknown 28983 1726883058.49912: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py 28983 1726883058.50058: Sending initial data 28983 
1726883058.50062: Sent initial data (167 bytes) 28983 1726883058.50957: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883058.51154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883058.51198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.51288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883058.51297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883058.51500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883058.51536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883058.53230: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 28983 1726883058.53239: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 
debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883058.53317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883058.53352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpkfe9s9ln /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py <<< 28983 1726883058.53406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py" <<< 28983 1726883058.53845: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpkfe9s9ln" to remote "/root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py" <<< 28983 1726883058.56347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883058.56431: stderr chunk (state=3): >>><<< 28983 1726883058.56475: stdout chunk (state=3): >>><<< 28983 1726883058.56505: done transferring module to remote 28983 1726883058.56679: _low_level_execute_command(): starting 28983 1726883058.56690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/ /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py && sleep 0' 28983 1726883058.57430: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883058.57452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.57460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883058.57491: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883058.57566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883058.57588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883058.57683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883058.59653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883058.59878: stderr chunk (state=3): >>><<< 28983 1726883058.59882: stdout chunk (state=3): >>><<< 28983 1726883058.59901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883058.59904: _low_level_execute_command(): starting 28983 1726883058.59911: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/AnsiballZ_network_connections.py && sleep 0' 28983 1726883058.60966: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883058.61031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883058.61037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.61040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883058.61042: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883058.61051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.61106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883058.61149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883058.61280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883058.93087: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8jarf88h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8jarf88h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/3ac79eb6-77ee-484f-9752-0ce3ea88e423: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": 
"absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883058.95090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883058.95153: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883058.95352: stderr chunk (state=3): >>><<< 28983 1726883058.95356: stdout chunk (state=3): >>><<< 28983 1726883058.95359: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8jarf88h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_8jarf88h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/3ac79eb6-77ee-484f-9752-0ce3ea88e423: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883058.95427: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883058.95568: _low_level_execute_command(): starting 28983 1726883058.95575: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883058.4569705-32261-79570755398042/ > /dev/null 2>&1 && sleep 0' 28983 1726883058.96593: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883058.96655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883058.96883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883058.96911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883058.98883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883058.98977: stderr chunk (state=3): >>><<< 28983 1726883058.98996: stdout chunk (state=3): >>><<< 28983 1726883058.99057: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883058.99065: handler run 
complete 28983 1726883058.99122: attempt loop complete, returning result 28983 1726883058.99166: _execute() done 28983 1726883058.99169: dumping result to json 28983 1726883058.99171: done dumping result, returning 28983 1726883058.99176: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-00000000146f] 28983 1726883058.99181: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146f 28983 1726883058.99400: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000146f 28983 1726883058.99404: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 28983 1726883058.99791: no more pending results, returning what we have 28983 1726883058.99795: results queue empty 28983 1726883058.99796: checking for any_errors_fatal 28983 1726883058.99804: done checking for any_errors_fatal 28983 1726883058.99805: checking for max_fail_percentage 28983 1726883058.99807: done checking for max_fail_percentage 28983 1726883058.99809: checking to see if all hosts have failed and the running result is not ok 28983 1726883058.99810: done checking to see if all hosts have failed 28983 1726883058.99810: getting the remaining hosts for this loop 28983 1726883058.99813: done getting the remaining hosts for this loop 28983 1726883058.99818: getting the next task for host managed_node2 28983 1726883058.99832: done getting next task for host managed_node2 28983 1726883058.99956: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883058.99962: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883058.99979: getting variables 28983 1726883058.99981: in VariableManager get_vars() 28983 1726883059.00027: Calling all_inventory to load vars for managed_node2 28983 1726883059.00031: Calling groups_inventory to load vars for managed_node2 28983 1726883059.00037: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.00047: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.00050: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.00144: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.02824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.04522: done with get_vars() 28983 1726883059.04560: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:19 -0400 (0:00:00.923) 0:01:29.044 ****** 28983 1726883059.04672: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883059.05145: worker is 1 (out of 1 available) 28983 1726883059.05160: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883059.05180: done queuing things up, now waiting for results queue to drain 28983 1726883059.05183: waiting for pending results... 
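Every event in this log follows one fixed shape: a worker PID, a Unix epoch timestamp with fractional seconds, a colon, and the message (e.g. `28983 1726883059.05808: running TaskExecutor() ...`). A minimal sketch of a parser for that shape, useful when post-processing a `-vvvv` run like this one (the regex and field names are assumptions inferred from the lines above, not an official Ansible log format):

```python
import re
from datetime import datetime, timezone

# Assumed event shape, inferred from this log: "<pid> <epoch.frac>: <message>".
EVENT_RE = re.compile(r"(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)")

def parse_event(line: str):
    """Return a dict for one debug-log event, or None for non-matching lines."""
    m = EVENT_RE.match(line)
    if not m:
        return None
    return {
        "pid": int(m.group("pid")),
        # Timestamps in the log are seconds since the epoch.
        "time": datetime.fromtimestamp(float(m.group("ts")), tz=timezone.utc),
        "message": m.group("msg"),
    }

event = parse_event("28983 1726883059.05808: running TaskExecutor() for managed_node2")
```

Task banners (`TASK [...] ***`) and result JSON do not match the pattern and would come back as `None`, which is convenient for filtering the event stream out of a mixed log.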
28983 1726883059.05808: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883059.06039: in run() - task 0affe814-3a2d-b16d-c0a7-000000001470 28983 1726883059.06043: variable 'ansible_search_path' from source: unknown 28983 1726883059.06047: variable 'ansible_search_path' from source: unknown 28983 1726883059.06224: calling self._execute() 28983 1726883059.06229: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.06235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.06238: variable 'omit' from source: magic vars 28983 1726883059.06608: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.06617: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.06844: variable 'network_state' from source: role '' defaults 28983 1726883059.06848: Evaluated conditional (network_state != {}): False 28983 1726883059.06851: when evaluation is False, skipping this task 28983 1726883059.06856: _execute() done 28983 1726883059.06858: dumping result to json 28983 1726883059.06860: done dumping result, returning 28983 1726883059.06862: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000001470] 28983 1726883059.06864: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001470 28983 1726883059.06936: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001470 28983 1726883059.06940: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883059.07003: no more pending results, returning what we have 28983 1726883059.07006: results queue empty 28983 1726883059.07007: checking for any_errors_fatal 28983 1726883059.07015: done checking for any_errors_fatal 
28983 1726883059.07016: checking for max_fail_percentage 28983 1726883059.07018: done checking for max_fail_percentage 28983 1726883059.07019: checking to see if all hosts have failed and the running result is not ok 28983 1726883059.07020: done checking to see if all hosts have failed 28983 1726883059.07021: getting the remaining hosts for this loop 28983 1726883059.07023: done getting the remaining hosts for this loop 28983 1726883059.07027: getting the next task for host managed_node2 28983 1726883059.07036: done getting next task for host managed_node2 28983 1726883059.07041: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883059.07046: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883059.07070: getting variables 28983 1726883059.07071: in VariableManager get_vars() 28983 1726883059.07110: Calling all_inventory to load vars for managed_node2 28983 1726883059.07113: Calling groups_inventory to load vars for managed_node2 28983 1726883059.07116: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.07125: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.07128: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.07132: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.08894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.10514: done with get_vars() 28983 1726883059.10540: done getting variables 28983 1726883059.10593: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:19 -0400 (0:00:00.059) 0:01:29.104 ****** 28983 1726883059.10624: entering _queue_task() for managed_node2/debug 28983 1726883059.10892: worker is 1 (out of 1 available) 28983 1726883059.10906: exiting _queue_task() for managed_node2/debug 28983 1726883059.10922: done queuing things up, now waiting for results queue to drain 28983 1726883059.10925: waiting for pending results... 
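The "Configure networking state" task above was skipped because the role default `network_state` is an empty dict, so its `when: network_state != {}` conditional evaluated False. A simplified model of that decision and the skip result it produces (this is a sketch of the observable behavior, not Ansible's actual conditional evaluator):

```python
# Simplified model of the skip seen above: with network_state defaulting to {},
# the conditional "network_state != {}" is False and the task never runs.
def evaluate_when(network_state: dict) -> dict:
    if network_state != {}:
        return {"changed": False, "skipped": False}
    return {
        "changed": False,
        "skipped": True,
        "false_condition": "network_state != {}",
        "skip_reason": "Conditional result was False",
    }

result = evaluate_when({})  # role '' defaults: network_state is an empty dict
```

The keys mirror the `skipping: [managed_node2]` result printed in the log, where `false_condition` records which conditional short-circuited the task.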
28983 1726883059.11354: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883059.11398: in run() - task 0affe814-3a2d-b16d-c0a7-000000001471 28983 1726883059.11417: variable 'ansible_search_path' from source: unknown 28983 1726883059.11420: variable 'ansible_search_path' from source: unknown 28983 1726883059.11464: calling self._execute() 28983 1726883059.11604: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.11640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.11645: variable 'omit' from source: magic vars 28983 1726883059.12107: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.12118: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.12131: variable 'omit' from source: magic vars 28983 1726883059.12189: variable 'omit' from source: magic vars 28983 1726883059.12218: variable 'omit' from source: magic vars 28983 1726883059.12261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883059.12294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883059.12312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883059.12327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.12340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.12374: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883059.12378: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.12381: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883059.12461: Set connection var ansible_connection to ssh 28983 1726883059.12477: Set connection var ansible_shell_executable to /bin/sh 28983 1726883059.12485: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883059.12494: Set connection var ansible_timeout to 10 28983 1726883059.12501: Set connection var ansible_pipelining to False 28983 1726883059.12503: Set connection var ansible_shell_type to sh 28983 1726883059.12530: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.12536: variable 'ansible_connection' from source: unknown 28983 1726883059.12540: variable 'ansible_module_compression' from source: unknown 28983 1726883059.12542: variable 'ansible_shell_type' from source: unknown 28983 1726883059.12544: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.12550: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.12554: variable 'ansible_pipelining' from source: unknown 28983 1726883059.12558: variable 'ansible_timeout' from source: unknown 28983 1726883059.12564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.12684: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883059.12697: variable 'omit' from source: magic vars 28983 1726883059.12700: starting attempt loop 28983 1726883059.12705: running the handler 28983 1726883059.12821: variable '__network_connections_result' from source: set_fact 28983 1726883059.12870: handler run complete 28983 1726883059.12888: attempt loop complete, returning result 28983 1726883059.12891: _execute() done 28983 1726883059.12895: dumping result to json 28983 1726883059.12898: 
done dumping result, returning 28983 1726883059.12912: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001471] 28983 1726883059.12915: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001471 28983 1726883059.13015: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001471 28983 1726883059.13022: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 28983 1726883059.13109: no more pending results, returning what we have 28983 1726883059.13113: results queue empty 28983 1726883059.13114: checking for any_errors_fatal 28983 1726883059.13121: done checking for any_errors_fatal 28983 1726883059.13122: checking for max_fail_percentage 28983 1726883059.13124: done checking for max_fail_percentage 28983 1726883059.13125: checking to see if all hosts have failed and the running result is not ok 28983 1726883059.13126: done checking to see if all hosts have failed 28983 1726883059.13128: getting the remaining hosts for this loop 28983 1726883059.13130: done getting the remaining hosts for this loop 28983 1726883059.13134: getting the next task for host managed_node2 28983 1726883059.13143: done getting next task for host managed_node2 28983 1726883059.13146: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883059.13152: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883059.13163: getting variables 28983 1726883059.13165: in VariableManager get_vars() 28983 1726883059.13205: Calling all_inventory to load vars for managed_node2 28983 1726883059.13208: Calling groups_inventory to load vars for managed_node2 28983 1726883059.13211: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.13219: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.13222: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.13226: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.14487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.16212: done with get_vars() 28983 1726883059.16237: done getting variables 28983 1726883059.16288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:19 -0400 (0:00:00.056) 0:01:29.160 ****** 28983 1726883059.16320: entering _queue_task() for managed_node2/debug 28983 1726883059.16566: worker is 1 (out of 1 available) 28983 1726883059.16579: exiting _queue_task() for managed_node2/debug 28983 1726883059.16592: done queuing things up, now waiting for results queue to drain 28983 1726883059.16594: waiting for pending results... 28983 1726883059.16803: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883059.16924: in run() - task 0affe814-3a2d-b16d-c0a7-000000001472 28983 1726883059.16938: variable 'ansible_search_path' from source: unknown 28983 1726883059.16942: variable 'ansible_search_path' from source: unknown 28983 1726883059.16980: calling self._execute() 28983 1726883059.17069: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.17078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.17088: variable 'omit' from source: magic vars 28983 1726883059.17422: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.17436: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.17442: variable 'omit' from source: magic vars 28983 1726883059.17502: variable 'omit' from source: magic vars 28983 1726883059.17531: variable 'omit' from source: magic vars 28983 1726883059.17569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883059.17606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883059.17624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883059.17641: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.17651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.17682: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883059.17686: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.17693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.17773: Set connection var ansible_connection to ssh 28983 1726883059.17785: Set connection var ansible_shell_executable to /bin/sh 28983 1726883059.17793: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883059.17803: Set connection var ansible_timeout to 10 28983 1726883059.17810: Set connection var ansible_pipelining to False 28983 1726883059.17814: Set connection var ansible_shell_type to sh 28983 1726883059.17839: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.17843: variable 'ansible_connection' from source: unknown 28983 1726883059.17846: variable 'ansible_module_compression' from source: unknown 28983 1726883059.17848: variable 'ansible_shell_type' from source: unknown 28983 1726883059.17853: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.17856: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.17862: variable 'ansible_pipelining' from source: unknown 28983 1726883059.17865: variable 'ansible_timeout' from source: unknown 28983 1726883059.17870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.17995: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883059.18006: variable 'omit' from source: magic vars 28983 1726883059.18012: starting attempt loop 28983 1726883059.18016: running the handler 28983 1726883059.18065: variable '__network_connections_result' from source: set_fact 28983 1726883059.18129: variable '__network_connections_result' from source: set_fact 28983 1726883059.18226: handler run complete 28983 1726883059.18251: attempt loop complete, returning result 28983 1726883059.18254: _execute() done 28983 1726883059.18257: dumping result to json 28983 1726883059.18269: done dumping result, returning 28983 1726883059.18272: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001472] 28983 1726883059.18280: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001472 28983 1726883059.18378: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001472 28983 1726883059.18381: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28983 1726883059.18479: no more pending results, returning what we have 28983 1726883059.18483: results queue empty 28983 1726883059.18484: checking for any_errors_fatal 28983 1726883059.18490: done checking for any_errors_fatal 28983 1726883059.18491: checking for max_fail_percentage 28983 1726883059.18493: done checking for max_fail_percentage 28983 1726883059.18494: checking to see if all hosts have 
failed and the running result is not ok 28983 1726883059.18495: done checking to see if all hosts have failed 28983 1726883059.18496: getting the remaining hosts for this loop 28983 1726883059.18498: done getting the remaining hosts for this loop 28983 1726883059.18502: getting the next task for host managed_node2 28983 1726883059.18509: done getting next task for host managed_node2 28983 1726883059.18513: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883059.18518: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883059.18530: getting variables 28983 1726883059.18532: in VariableManager get_vars() 28983 1726883059.18578: Calling all_inventory to load vars for managed_node2 28983 1726883059.18581: Calling groups_inventory to load vars for managed_node2 28983 1726883059.18584: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.18593: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.18596: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.18599: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.19842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.21476: done with get_vars() 28983 1726883059.21501: done getting variables 28983 1726883059.21548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:19 -0400 (0:00:00.052) 0:01:29.213 ****** 28983 1726883059.21575: entering _queue_task() for managed_node2/debug 28983 1726883059.21796: worker is 1 (out of 1 available) 28983 1726883059.21811: exiting _queue_task() for managed_node2/debug 28983 1726883059.21824: done queuing things up, now waiting for results queue to drain 28983 1726883059.21826: waiting for pending results... 
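The debug task result above shows `"stderr": "\n"` alongside `"stderr_lines": [""]`. That pairing is what newline-splitting produces: Ansible derives the `_lines` variants from the raw string, and a stderr consisting of a single newline splits into exactly one empty line. A one-liner demonstrating the relationship (using Python's `str.splitlines`, which drops the trailing newline the same way):

```python
# The module result above reported stderr "\n"; splitting it on line
# boundaries yields a single empty string, which is why the debug task
# printed __network_connections_result.stderr_lines as [""].
stderr = "\n"
stderr_lines = stderr.splitlines()
```

So an `stderr_lines` of `[""]` in a result like this one means "the module wrote only a newline to stderr", not that a line of output was lost.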
28983 1726883059.22019: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883059.22133: in run() - task 0affe814-3a2d-b16d-c0a7-000000001473 28983 1726883059.22146: variable 'ansible_search_path' from source: unknown 28983 1726883059.22149: variable 'ansible_search_path' from source: unknown 28983 1726883059.22188: calling self._execute() 28983 1726883059.22272: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.22283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.22294: variable 'omit' from source: magic vars 28983 1726883059.22611: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.22623: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.22729: variable 'network_state' from source: role '' defaults 28983 1726883059.22738: Evaluated conditional (network_state != {}): False 28983 1726883059.22742: when evaluation is False, skipping this task 28983 1726883059.22745: _execute() done 28983 1726883059.22750: dumping result to json 28983 1726883059.22754: done dumping result, returning 28983 1726883059.22762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000001473] 28983 1726883059.22768: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001473 28983 1726883059.22870: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001473 28983 1726883059.22873: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883059.22924: no more pending results, returning what we have 28983 1726883059.22928: results queue empty 28983 1726883059.22929: checking for any_errors_fatal 28983 1726883059.22937: done checking for any_errors_fatal 28983 1726883059.22938: checking for 
max_fail_percentage 28983 1726883059.22939: done checking for max_fail_percentage 28983 1726883059.22940: checking to see if all hosts have failed and the running result is not ok 28983 1726883059.22941: done checking to see if all hosts have failed 28983 1726883059.22942: getting the remaining hosts for this loop 28983 1726883059.22944: done getting the remaining hosts for this loop 28983 1726883059.22948: getting the next task for host managed_node2 28983 1726883059.22956: done getting next task for host managed_node2 28983 1726883059.22960: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883059.22966: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883059.22986: getting variables 28983 1726883059.22988: in VariableManager get_vars() 28983 1726883059.23023: Calling all_inventory to load vars for managed_node2 28983 1726883059.23026: Calling groups_inventory to load vars for managed_node2 28983 1726883059.23028: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.23043: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.23046: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.23049: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.24405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.26014: done with get_vars() 28983 1726883059.26038: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:19 -0400 (0:00:00.045) 0:01:29.258 ****** 28983 1726883059.26118: entering _queue_task() for managed_node2/ping 28983 1726883059.26351: worker is 1 (out of 1 available) 28983 1726883059.26366: exiting _queue_task() for managed_node2/ping 28983 1726883059.26381: done queuing things up, now waiting for results queue to drain 28983 1726883059.26383: waiting for pending results... 
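Below, the `ping` action's `_low_level_execute_command()` first resolves the remote home (`echo ~` → `/root`) and then creates a per-task temp directory named `ansible-tmp-1726883059.31575-32321-175968155359525`. The name visibly combines a timestamp, a process id, and a random suffix; a sketch of that composition (the field meanings are an assumption read off the example path, not a documented contract):

```python
# Assumed composition of the remote temp-dir name seen in the mkdir command:
# "ansible-tmp-<epoch.frac>-<pid>-<random suffix>". Fields are passed in as
# strings/ints here to reproduce the exact path from the log.
def remote_tmp_name(ts: str, pid: int, suffix: int) -> str:
    return f"ansible-tmp-{ts}-{pid}-{suffix}"

name = remote_tmp_name("1726883059.31575", 32321, 175968155359525)
```

Because the name embeds the timestamp and PID, concurrent tasks on the same host get distinct directories, and the `umask 77 && mkdir -p ... && mkdir ...` idiom in the log ensures the leaf directory is freshly created and private to the connecting user.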
28983 1726883059.26859: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883059.26888: in run() - task 0affe814-3a2d-b16d-c0a7-000000001474 28983 1726883059.26905: variable 'ansible_search_path' from source: unknown 28983 1726883059.26910: variable 'ansible_search_path' from source: unknown 28983 1726883059.26951: calling self._execute() 28983 1726883059.27066: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.27106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.27110: variable 'omit' from source: magic vars 28983 1726883059.27565: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.27582: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.27639: variable 'omit' from source: magic vars 28983 1726883059.27684: variable 'omit' from source: magic vars 28983 1726883059.27725: variable 'omit' from source: magic vars 28983 1726883059.27775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883059.27820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883059.27845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883059.27873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.27887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.27925: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883059.27929: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.27937: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883059.28091: Set connection var ansible_connection to ssh 28983 1726883059.28099: Set connection var ansible_shell_executable to /bin/sh 28983 1726883059.28102: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883059.28113: Set connection var ansible_timeout to 10 28983 1726883059.28121: Set connection var ansible_pipelining to False 28983 1726883059.28124: Set connection var ansible_shell_type to sh 28983 1726883059.28200: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.28209: variable 'ansible_connection' from source: unknown 28983 1726883059.28212: variable 'ansible_module_compression' from source: unknown 28983 1726883059.28215: variable 'ansible_shell_type' from source: unknown 28983 1726883059.28217: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.28219: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.28222: variable 'ansible_pipelining' from source: unknown 28983 1726883059.28224: variable 'ansible_timeout' from source: unknown 28983 1726883059.28226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.28589: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883059.28594: variable 'omit' from source: magic vars 28983 1726883059.28597: starting attempt loop 28983 1726883059.28599: running the handler 28983 1726883059.28601: _low_level_execute_command(): starting 28983 1726883059.28603: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883059.29343: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883059.29349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 
1726883059.29352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883059.29355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883059.29358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883059.29360: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883059.29362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.29365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883059.29453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.29482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883059.29493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883059.29514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.29627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883059.31391: stdout chunk (state=3): >>>/root <<< 28983 1726883059.31503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883059.31548: stderr chunk (state=3): >>><<< 28983 1726883059.31552: stdout chunk (state=3): >>><<< 28983 1726883059.31572: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883059.31587: _low_level_execute_command(): starting 28983 1726883059.31593: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525 `" && echo ansible-tmp-1726883059.31575-32321-175968155359525="` echo /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525 `" ) && sleep 0' 28983 1726883059.32006: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883059.32042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883059.32053: stderr chunk (state=3): 
>>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.32058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883059.32060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883059.32063: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.32108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883059.32116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.32192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883059.34191: stdout chunk (state=3): >>>ansible-tmp-1726883059.31575-32321-175968155359525=/root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525 <<< 28983 1726883059.34307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883059.34354: stderr chunk (state=3): >>><<< 28983 1726883059.34357: stdout chunk (state=3): >>><<< 28983 1726883059.34374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883059.31575-32321-175968155359525=/root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883059.34408: variable 'ansible_module_compression' from source: unknown 28983 1726883059.34442: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883059.34477: variable 'ansible_facts' from source: unknown 28983 1726883059.34529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py 28983 1726883059.34637: Sending initial data 28983 1726883059.34641: Sent initial data (151 bytes) 28983 1726883059.35086: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883059.35089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.35091: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883059.35096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.35151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883059.35158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.35227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883059.36859: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883059.36864: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883059.36921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883059.36991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8tbwafjj /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py <<< 28983 1726883059.36998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py" <<< 28983 1726883059.37062: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8tbwafjj" to remote "/root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py" <<< 28983 1726883059.37927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883059.37984: stderr chunk (state=3): >>><<< 28983 1726883059.37988: stdout chunk (state=3): >>><<< 28983 1726883059.38005: done transferring module to remote 28983 1726883059.38014: _low_level_execute_command(): starting 28983 1726883059.38020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/ /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py && sleep 0' 28983 1726883059.38429: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883059.38465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883059.38469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883059.38473: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883059.38476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883059.38482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.38538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883059.38542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.38613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883059.40721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883059.40725: stderr chunk (state=3): >>><<< 28983 1726883059.40729: stdout chunk (state=3): >>><<< 28983 1726883059.40732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883059.40736: _low_level_execute_command(): starting 28983 1726883059.40739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/AnsiballZ_ping.py && sleep 0' 28983 1726883059.41251: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883059.41266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883059.41285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883059.41308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883059.41327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883059.41344: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883059.41449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883059.41479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.41583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883059.58535: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883059.59962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883059.60046: stderr chunk (state=3): >>><<< 28983 1726883059.60063: stdout chunk (state=3): >>><<< 28983 1726883059.60186: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883059.60191: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883059.60193: _low_level_execute_command(): starting 28983 1726883059.60196: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883059.31575-32321-175968155359525/ > /dev/null 2>&1 && sleep 0' 28983 1726883059.60882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883059.60886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883059.60890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883059.60892: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found <<< 28983 1726883059.60895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.60978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883059.61015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.61113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883059.63052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883059.63104: stderr chunk (state=3): >>><<< 28983 1726883059.63106: stdout chunk (state=3): >>><<< 28983 1726883059.63128: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883059.63172: handler run complete 28983 
1726883059.63176: attempt loop complete, returning result 28983 1726883059.63178: _execute() done 28983 1726883059.63180: dumping result to json 28983 1726883059.63183: done dumping result, returning 28983 1726883059.63185: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000001474] 28983 1726883059.63191: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001474 28983 1726883059.63296: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001474 28983 1726883059.63299: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883059.63379: no more pending results, returning what we have 28983 1726883059.63384: results queue empty 28983 1726883059.63385: checking for any_errors_fatal 28983 1726883059.63393: done checking for any_errors_fatal 28983 1726883059.63394: checking for max_fail_percentage 28983 1726883059.63396: done checking for max_fail_percentage 28983 1726883059.63397: checking to see if all hosts have failed and the running result is not ok 28983 1726883059.63398: done checking to see if all hosts have failed 28983 1726883059.63399: getting the remaining hosts for this loop 28983 1726883059.63401: done getting the remaining hosts for this loop 28983 1726883059.63406: getting the next task for host managed_node2 28983 1726883059.63417: done getting next task for host managed_node2 28983 1726883059.63420: ^ task is: TASK: meta (role_complete) 28983 1726883059.63426: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883059.63441: getting variables 28983 1726883059.63443: in VariableManager get_vars() 28983 1726883059.63490: Calling all_inventory to load vars for managed_node2 28983 1726883059.63493: Calling groups_inventory to load vars for managed_node2 28983 1726883059.63496: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.63506: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.63509: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.63513: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.65484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.67544: done with get_vars() 28983 1726883059.67570: done getting variables 28983 1726883059.67642: done queuing things up, now waiting for results queue to drain 28983 1726883059.67644: results queue empty 28983 1726883059.67645: checking for any_errors_fatal 28983 1726883059.67648: done checking for any_errors_fatal 28983 1726883059.67649: checking for max_fail_percentage 28983 1726883059.67650: done checking for max_fail_percentage 28983 1726883059.67650: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883059.67651: done checking to see if all hosts have failed 28983 1726883059.67651: getting the remaining hosts for this loop 28983 1726883059.67652: done getting the remaining hosts for this loop 28983 1726883059.67654: getting the next task for host managed_node2 28983 1726883059.67661: done getting next task for host managed_node2 28983 1726883059.67663: ^ task is: TASK: Asserts 28983 1726883059.67665: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883059.67667: getting variables 28983 1726883059.67668: in VariableManager get_vars() 28983 1726883059.67679: Calling all_inventory to load vars for managed_node2 28983 1726883059.67682: Calling groups_inventory to load vars for managed_node2 28983 1726883059.67684: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.67688: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.67690: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.67693: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.68805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.70886: done with get_vars() 28983 1726883059.70908: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:44:19 -0400 (0:00:00.448) 0:01:29.707 ****** 28983 1726883059.70981: entering _queue_task() for managed_node2/include_tasks 28983 1726883059.71265: worker is 1 (out of 1 available) 28983 1726883059.71283: exiting _queue_task() for managed_node2/include_tasks 28983 1726883059.71297: done queuing things up, now waiting for results queue to drain 28983 1726883059.71300: waiting for pending results... 
28983 1726883059.71509: running TaskExecutor() for managed_node2/TASK: Asserts 28983 1726883059.71601: in run() - task 0affe814-3a2d-b16d-c0a7-00000000100a 28983 1726883059.71615: variable 'ansible_search_path' from source: unknown 28983 1726883059.71618: variable 'ansible_search_path' from source: unknown 28983 1726883059.71665: variable 'lsr_assert' from source: include params 28983 1726883059.71854: variable 'lsr_assert' from source: include params 28983 1726883059.71917: variable 'omit' from source: magic vars 28983 1726883059.72045: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.72054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.72068: variable 'omit' from source: magic vars 28983 1726883059.72285: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.72299: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.72305: variable 'item' from source: unknown 28983 1726883059.72358: variable 'item' from source: unknown 28983 1726883059.72387: variable 'item' from source: unknown 28983 1726883059.72444: variable 'item' from source: unknown 28983 1726883059.72590: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.72594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.72596: variable 'omit' from source: magic vars 28983 1726883059.72714: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.72720: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.72723: variable 'item' from source: unknown 28983 1726883059.72772: variable 'item' from source: unknown 28983 1726883059.72799: variable 'item' from source: unknown 28983 1726883059.72852: variable 'item' from source: unknown 28983 1726883059.72921: dumping result to json 28983 1726883059.72928: done dumping result, returning 28983 
1726883059.72931: done running TaskExecutor() for managed_node2/TASK: Asserts [0affe814-3a2d-b16d-c0a7-00000000100a] 28983 1726883059.72934: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000100a 28983 1726883059.72975: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000100a 28983 1726883059.72979: WORKER PROCESS EXITING 28983 1726883059.73015: no more pending results, returning what we have 28983 1726883059.73020: in VariableManager get_vars() 28983 1726883059.73069: Calling all_inventory to load vars for managed_node2 28983 1726883059.73072: Calling groups_inventory to load vars for managed_node2 28983 1726883059.73076: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.73098: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.73103: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.73107: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.74607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.76193: done with get_vars() 28983 1726883059.76217: variable 'ansible_search_path' from source: unknown 28983 1726883059.76218: variable 'ansible_search_path' from source: unknown 28983 1726883059.76252: variable 'ansible_search_path' from source: unknown 28983 1726883059.76254: variable 'ansible_search_path' from source: unknown 28983 1726883059.76277: we have included files to process 28983 1726883059.76278: generating all_blocks data 28983 1726883059.76280: done generating all_blocks data 28983 1726883059.76285: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726883059.76286: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726883059.76288: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 28983 1726883059.76378: in VariableManager get_vars() 28983 1726883059.76394: done with get_vars() 28983 1726883059.76488: done processing included file 28983 1726883059.76490: iterating over new_blocks loaded from include file 28983 1726883059.76491: in VariableManager get_vars() 28983 1726883059.76503: done with get_vars() 28983 1726883059.76504: filtering new block on tags 28983 1726883059.76537: done filtering new block on tags 28983 1726883059.76539: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item=tasks/assert_device_present.yml) 28983 1726883059.76544: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883059.76545: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883059.76548: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883059.76657: in VariableManager get_vars() 28983 1726883059.76672: done with get_vars() 28983 1726883059.76744: done processing included file 28983 1726883059.76745: iterating over new_blocks loaded from include file 28983 1726883059.76747: in VariableManager get_vars() 28983 1726883059.76762: done with get_vars() 28983 1726883059.76763: filtering new block on tags 28983 1726883059.76789: done filtering new block on tags 28983 1726883059.76790: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for 
managed_node2 => (item=tasks/assert_profile_absent.yml) 28983 1726883059.76793: extending task lists for all hosts with included blocks 28983 1726883059.77958: done extending task lists 28983 1726883059.77960: done processing included files 28983 1726883059.77961: results queue empty 28983 1726883059.77962: checking for any_errors_fatal 28983 1726883059.77964: done checking for any_errors_fatal 28983 1726883059.77965: checking for max_fail_percentage 28983 1726883059.77966: done checking for max_fail_percentage 28983 1726883059.77967: checking to see if all hosts have failed and the running result is not ok 28983 1726883059.77968: done checking to see if all hosts have failed 28983 1726883059.77969: getting the remaining hosts for this loop 28983 1726883059.77971: done getting the remaining hosts for this loop 28983 1726883059.77974: getting the next task for host managed_node2 28983 1726883059.77979: done getting next task for host managed_node2 28983 1726883059.77981: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726883059.77985: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883059.77992: getting variables 28983 1726883059.77994: in VariableManager get_vars() 28983 1726883059.78006: Calling all_inventory to load vars for managed_node2 28983 1726883059.78008: Calling groups_inventory to load vars for managed_node2 28983 1726883059.78012: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.78018: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.78021: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.78025: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.79654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.81231: done with get_vars() 28983 1726883059.81257: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:44:19 -0400 (0:00:00.103) 0:01:29.810 ****** 28983 1726883059.81319: entering _queue_task() for managed_node2/include_tasks 28983 1726883059.81586: worker is 1 (out of 1 available) 28983 1726883059.81599: exiting _queue_task() for managed_node2/include_tasks 28983 1726883059.81612: done queuing things up, now waiting for results queue to drain 28983 1726883059.81614: waiting for pending results... 
28983 1726883059.81820: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726883059.81912: in run() - task 0affe814-3a2d-b16d-c0a7-0000000015cf 28983 1726883059.81925: variable 'ansible_search_path' from source: unknown 28983 1726883059.81928: variable 'ansible_search_path' from source: unknown 28983 1726883059.81966: calling self._execute() 28983 1726883059.82054: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.82059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.82075: variable 'omit' from source: magic vars 28983 1726883059.82548: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.82566: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.82569: _execute() done 28983 1726883059.82572: dumping result to json 28983 1726883059.82574: done dumping result, returning 28983 1726883059.82577: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-0000000015cf] 28983 1726883059.82579: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015cf 28983 1726883059.82799: no more pending results, returning what we have 28983 1726883059.82804: in VariableManager get_vars() 28983 1726883059.82849: Calling all_inventory to load vars for managed_node2 28983 1726883059.82853: Calling groups_inventory to load vars for managed_node2 28983 1726883059.82856: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.82866: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.82869: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.82873: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.83458: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015cf 28983 1726883059.83462: WORKER PROCESS EXITING 28983 
1726883059.85422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.88554: done with get_vars() 28983 1726883059.88602: variable 'ansible_search_path' from source: unknown 28983 1726883059.88604: variable 'ansible_search_path' from source: unknown 28983 1726883059.88614: variable 'item' from source: include params 28983 1726883059.88748: variable 'item' from source: include params 28983 1726883059.88796: we have included files to process 28983 1726883059.88798: generating all_blocks data 28983 1726883059.88804: done generating all_blocks data 28983 1726883059.88806: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883059.88807: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883059.88811: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883059.89059: done processing included file 28983 1726883059.89061: iterating over new_blocks loaded from include file 28983 1726883059.89063: in VariableManager get_vars() 28983 1726883059.89087: done with get_vars() 28983 1726883059.89089: filtering new block on tags 28983 1726883059.89130: done filtering new block on tags 28983 1726883059.89134: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726883059.89142: extending task lists for all hosts with included blocks 28983 1726883059.89379: done extending task lists 28983 1726883059.89381: done processing included files 28983 1726883059.89382: results queue empty 28983 1726883059.89383: checking for any_errors_fatal 28983 1726883059.89388: done 
checking for any_errors_fatal 28983 1726883059.89389: checking for max_fail_percentage 28983 1726883059.89391: done checking for max_fail_percentage 28983 1726883059.89392: checking to see if all hosts have failed and the running result is not ok 28983 1726883059.89393: done checking to see if all hosts have failed 28983 1726883059.89394: getting the remaining hosts for this loop 28983 1726883059.89395: done getting the remaining hosts for this loop 28983 1726883059.89399: getting the next task for host managed_node2 28983 1726883059.89405: done getting next task for host managed_node2 28983 1726883059.89408: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726883059.89412: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883059.89415: getting variables 28983 1726883059.89416: in VariableManager get_vars() 28983 1726883059.89429: Calling all_inventory to load vars for managed_node2 28983 1726883059.89432: Calling groups_inventory to load vars for managed_node2 28983 1726883059.89437: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883059.89444: Calling all_plugins_play to load vars for managed_node2 28983 1726883059.89455: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883059.89460: Calling groups_plugins_play to load vars for managed_node2 28983 1726883059.91664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883059.94751: done with get_vars() 28983 1726883059.94787: done getting variables 28983 1726883059.94956: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:44:19 -0400 (0:00:00.136) 0:01:29.947 ****** 28983 1726883059.94993: entering _queue_task() for managed_node2/stat 28983 1726883059.95398: worker is 1 (out of 1 available) 28983 1726883059.95412: exiting _queue_task() for managed_node2/stat 28983 1726883059.95426: done queuing things up, now waiting for results queue to drain 28983 1726883059.95429: waiting for pending results... 
28983 1726883059.95852: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726883059.95891: in run() - task 0affe814-3a2d-b16d-c0a7-000000001647 28983 1726883059.95907: variable 'ansible_search_path' from source: unknown 28983 1726883059.95911: variable 'ansible_search_path' from source: unknown 28983 1726883059.95951: calling self._execute() 28983 1726883059.96085: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.96089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.96092: variable 'omit' from source: magic vars 28983 1726883059.96628: variable 'ansible_distribution_major_version' from source: facts 28983 1726883059.96632: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883059.96638: variable 'omit' from source: magic vars 28983 1726883059.96640: variable 'omit' from source: magic vars 28983 1726883059.96723: variable 'interface' from source: play vars 28983 1726883059.96745: variable 'omit' from source: magic vars 28983 1726883059.96796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883059.96840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883059.96939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883059.96943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.96948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883059.96951: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883059.96953: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.96956: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.97084: Set connection var ansible_connection to ssh 28983 1726883059.97087: Set connection var ansible_shell_executable to /bin/sh 28983 1726883059.97090: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883059.97139: Set connection var ansible_timeout to 10 28983 1726883059.97142: Set connection var ansible_pipelining to False 28983 1726883059.97144: Set connection var ansible_shell_type to sh 28983 1726883059.97147: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.97150: variable 'ansible_connection' from source: unknown 28983 1726883059.97152: variable 'ansible_module_compression' from source: unknown 28983 1726883059.97154: variable 'ansible_shell_type' from source: unknown 28983 1726883059.97157: variable 'ansible_shell_executable' from source: unknown 28983 1726883059.97159: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883059.97162: variable 'ansible_pipelining' from source: unknown 28983 1726883059.97165: variable 'ansible_timeout' from source: unknown 28983 1726883059.97167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883059.97401: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883059.97510: variable 'omit' from source: magic vars 28983 1726883059.97515: starting attempt loop 28983 1726883059.97518: running the handler 28983 1726883059.97520: _low_level_execute_command(): starting 28983 1726883059.97522: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883059.98158: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883059.98254: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883059.98293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883059.98297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883059.98347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883059.98426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.00260: stdout chunk (state=3): >>>/root <<< 28983 1726883060.00598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.00602: stdout chunk (state=3): >>><<< 28983 1726883060.00604: stderr chunk (state=3): >>><<< 28983 1726883060.00607: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.00610: _low_level_execute_command(): starting 28983 1726883060.00614: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617 `" && echo ansible-tmp-1726883060.0057743-32349-220787933679617="` echo /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617 `" ) && sleep 0' 28983 1726883060.01249: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883060.01259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.01271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.01289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883060.01313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883060.01321: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883060.01331: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.01511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883060.01519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.01522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.01525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.01527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.01582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.03629: stdout chunk (state=3): >>>ansible-tmp-1726883060.0057743-32349-220787933679617=/root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617 <<< 28983 1726883060.03839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.03842: stdout chunk (state=3): >>><<< 28983 1726883060.03845: stderr chunk (state=3): >>><<< 28983 1726883060.04042: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883060.0057743-32349-220787933679617=/root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.04046: variable 'ansible_module_compression' from source: unknown 28983 1726883060.04048: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883060.04051: variable 'ansible_facts' from source: unknown 28983 1726883060.04153: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py 28983 1726883060.04431: Sending initial data 28983 1726883060.04439: Sent initial data (153 bytes) 28983 1726883060.04958: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883060.04969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.04984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.05055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.05101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.05113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.05123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.05241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.06903: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883060.06980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883060.07058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpoy_o55u2 /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py <<< 28983 1726883060.07081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py" <<< 28983 1726883060.07170: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpoy_o55u2" to remote "/root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py" <<< 28983 1726883060.08493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.08497: stderr chunk (state=3): >>><<< 28983 1726883060.08500: stdout chunk (state=3): >>><<< 28983 1726883060.08526: done transferring module to remote 28983 1726883060.08539: _low_level_execute_command(): starting 28983 1726883060.08544: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/ /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py && sleep 0' 28983 1726883060.09220: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883060.09223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.09225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.09228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883060.09230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 
1726883060.09233: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883060.09238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.09245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883060.09248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883060.09250: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883060.09262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.09274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.09292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883060.09302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883060.09310: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883060.09328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.09401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.09414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.09424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.09537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.11546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.11626: stderr chunk (state=3): >>><<< 28983 1726883060.11629: stdout chunk (state=3): >>><<< 28983 1726883060.11663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.11755: _low_level_execute_command(): starting 28983 1726883060.11759: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/AnsiballZ_stat.py && sleep 0' 28983 1726883060.12379: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883060.12409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.12494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.12513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.12567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.12594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.12610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.12743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.30069: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38274, "dev": 23, "nlink": 1, "atime": 1726883039.2194068, "mtime": 1726883039.2194068, "ctime": 1726883039.2194068, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883060.31697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883060.31719: stderr chunk (state=3): >>><<< 28983 1726883060.31735: stdout chunk (state=3): >>><<< 28983 1726883060.31870: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38274, "dev": 23, "nlink": 1, "atime": 1726883039.2194068, "mtime": 1726883039.2194068, "ctime": 1726883039.2194068, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883060.32090: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883060.32094: _low_level_execute_command(): starting 28983 1726883060.32097: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883060.0057743-32349-220787933679617/ > /dev/null 2>&1 && sleep 0' 28983 1726883060.32971: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.32991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.33427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.33472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.33551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.35575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.35650: stderr chunk (state=3): >>><<< 28983 1726883060.35861: stdout chunk (state=3): >>><<< 28983 1726883060.35866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.35870: handler run complete 28983 1726883060.35872: attempt loop complete, returning result 28983 1726883060.35874: _execute() done 28983 1726883060.35876: dumping result to json 28983 1726883060.36217: done dumping result, returning 28983 1726883060.36220: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-000000001647] 28983 1726883060.36223: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001647 28983 1726883060.36311: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001647 ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726883039.2194068, "block_size": 4096, "blocks": 0, "ctime": 1726883039.2194068, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38274, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726883039.2194068, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 28983 1726883060.36576: no more pending results, returning what we have 28983 1726883060.36580: results queue empty 28983 1726883060.36581: checking for any_errors_fatal 28983 1726883060.36584: done checking for any_errors_fatal 28983 1726883060.36585: checking for max_fail_percentage 28983 1726883060.36588: done checking for max_fail_percentage 
28983 1726883060.36589: checking to see if all hosts have failed and the running result is not ok 28983 1726883060.36590: done checking to see if all hosts have failed 28983 1726883060.36591: getting the remaining hosts for this loop 28983 1726883060.36594: done getting the remaining hosts for this loop 28983 1726883060.36600: getting the next task for host managed_node2 28983 1726883060.36613: done getting next task for host managed_node2 28983 1726883060.36616: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 28983 1726883060.36621: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883060.36627: getting variables 28983 1726883060.36629: in VariableManager get_vars() 28983 1726883060.36946: Calling all_inventory to load vars for managed_node2 28983 1726883060.36950: Calling groups_inventory to load vars for managed_node2 28983 1726883060.36955: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883060.37013: Calling all_plugins_play to load vars for managed_node2 28983 1726883060.37019: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883060.37025: Calling groups_plugins_play to load vars for managed_node2 28983 1726883060.37741: WORKER PROCESS EXITING 28983 1726883060.39855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883060.43148: done with get_vars() 28983 1726883060.43186: done getting variables 28983 1726883060.43273: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883060.43428: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:44:20 -0400 (0:00:00.484) 0:01:30.432 ****** 28983 1726883060.43481: entering _queue_task() for managed_node2/assert 28983 1726883060.43905: worker is 1 (out of 1 available) 28983 1726883060.43918: exiting _queue_task() for managed_node2/assert 28983 1726883060.43932: done queuing things up, now waiting for results queue to drain 28983 1726883060.44144: waiting for pending results... 
28983 1726883060.44263: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 28983 1726883060.44411: in run() - task 0affe814-3a2d-b16d-c0a7-0000000015d0 28983 1726883060.44436: variable 'ansible_search_path' from source: unknown 28983 1726883060.44446: variable 'ansible_search_path' from source: unknown 28983 1726883060.44504: calling self._execute() 28983 1726883060.44632: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.44647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.44664: variable 'omit' from source: magic vars 28983 1726883060.45159: variable 'ansible_distribution_major_version' from source: facts 28983 1726883060.45179: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883060.45190: variable 'omit' from source: magic vars 28983 1726883060.45343: variable 'omit' from source: magic vars 28983 1726883060.45391: variable 'interface' from source: play vars 28983 1726883060.45420: variable 'omit' from source: magic vars 28983 1726883060.45485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883060.45535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883060.45579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883060.45606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883060.45625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883060.45683: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883060.45697: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.45707: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.45841: Set connection var ansible_connection to ssh 28983 1726883060.45859: Set connection var ansible_shell_executable to /bin/sh 28983 1726883060.45875: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883060.45940: Set connection var ansible_timeout to 10 28983 1726883060.45943: Set connection var ansible_pipelining to False 28983 1726883060.45946: Set connection var ansible_shell_type to sh 28983 1726883060.45950: variable 'ansible_shell_executable' from source: unknown 28983 1726883060.45958: variable 'ansible_connection' from source: unknown 28983 1726883060.45965: variable 'ansible_module_compression' from source: unknown 28983 1726883060.45972: variable 'ansible_shell_type' from source: unknown 28983 1726883060.45979: variable 'ansible_shell_executable' from source: unknown 28983 1726883060.45986: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.46001: variable 'ansible_pipelining' from source: unknown 28983 1726883060.46046: variable 'ansible_timeout' from source: unknown 28983 1726883060.46049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.46222: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883060.46246: variable 'omit' from source: magic vars 28983 1726883060.46259: starting attempt loop 28983 1726883060.46271: running the handler 28983 1726883060.46442: variable 'interface_stat' from source: set_fact 28983 1726883060.46462: Evaluated conditional (interface_stat.stat.exists): True 28983 1726883060.46475: handler run complete 28983 1726883060.46503: attempt loop complete, returning result 28983 
1726883060.46506: _execute() done 28983 1726883060.46509: dumping result to json 28983 1726883060.46512: done dumping result, returning 28983 1726883060.46537: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [0affe814-3a2d-b16d-c0a7-0000000015d0] 28983 1726883060.46541: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015d0 28983 1726883060.46627: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015d0 28983 1726883060.46630: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883060.46695: no more pending results, returning what we have 28983 1726883060.46699: results queue empty 28983 1726883060.46700: checking for any_errors_fatal 28983 1726883060.46715: done checking for any_errors_fatal 28983 1726883060.46716: checking for max_fail_percentage 28983 1726883060.46718: done checking for max_fail_percentage 28983 1726883060.46719: checking to see if all hosts have failed and the running result is not ok 28983 1726883060.46720: done checking to see if all hosts have failed 28983 1726883060.46721: getting the remaining hosts for this loop 28983 1726883060.46723: done getting the remaining hosts for this loop 28983 1726883060.46728: getting the next task for host managed_node2 28983 1726883060.46741: done getting next task for host managed_node2 28983 1726883060.46744: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28983 1726883060.46748: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883060.46752: getting variables 28983 1726883060.46754: in VariableManager get_vars() 28983 1726883060.46798: Calling all_inventory to load vars for managed_node2 28983 1726883060.46801: Calling groups_inventory to load vars for managed_node2 28983 1726883060.46805: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883060.46815: Calling all_plugins_play to load vars for managed_node2 28983 1726883060.46819: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883060.46822: Calling groups_plugins_play to load vars for managed_node2 28983 1726883060.48098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883060.50033: done with get_vars() 28983 1726883060.50073: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:44:20 -0400 (0:00:00.066) 0:01:30.499 ****** 28983 1726883060.50181: entering _queue_task() for managed_node2/include_tasks 28983 1726883060.50475: worker is 1 (out of 1 available) 28983 1726883060.50489: exiting _queue_task() for managed_node2/include_tasks 28983 1726883060.50504: done queuing things up, now waiting for results queue to drain 28983 1726883060.50506: waiting for pending results... 
28983 1726883060.50729: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28983 1726883060.50829: in run() - task 0affe814-3a2d-b16d-c0a7-0000000015d4 28983 1726883060.50846: variable 'ansible_search_path' from source: unknown 28983 1726883060.50849: variable 'ansible_search_path' from source: unknown 28983 1726883060.50944: calling self._execute() 28983 1726883060.51005: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.51016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.51024: variable 'omit' from source: magic vars 28983 1726883060.51458: variable 'ansible_distribution_major_version' from source: facts 28983 1726883060.51475: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883060.51479: _execute() done 28983 1726883060.51482: dumping result to json 28983 1726883060.51641: done dumping result, returning 28983 1726883060.51644: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-b16d-c0a7-0000000015d4] 28983 1726883060.51646: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015d4 28983 1726883060.51716: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015d4 28983 1726883060.51719: WORKER PROCESS EXITING 28983 1726883060.51750: no more pending results, returning what we have 28983 1726883060.51754: in VariableManager get_vars() 28983 1726883060.51795: Calling all_inventory to load vars for managed_node2 28983 1726883060.51799: Calling groups_inventory to load vars for managed_node2 28983 1726883060.51803: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883060.51813: Calling all_plugins_play to load vars for managed_node2 28983 1726883060.51816: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883060.51820: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883060.57537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883060.59125: done with get_vars() 28983 1726883060.59149: variable 'ansible_search_path' from source: unknown 28983 1726883060.59151: variable 'ansible_search_path' from source: unknown 28983 1726883060.59157: variable 'item' from source: include params 28983 1726883060.59224: variable 'item' from source: include params 28983 1726883060.59254: we have included files to process 28983 1726883060.59255: generating all_blocks data 28983 1726883060.59257: done generating all_blocks data 28983 1726883060.59258: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883060.59259: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883060.59261: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883060.60006: done processing included file 28983 1726883060.60007: iterating over new_blocks loaded from include file 28983 1726883060.60008: in VariableManager get_vars() 28983 1726883060.60023: done with get_vars() 28983 1726883060.60025: filtering new block on tags 28983 1726883060.60080: done filtering new block on tags 28983 1726883060.60083: in VariableManager get_vars() 28983 1726883060.60094: done with get_vars() 28983 1726883060.60095: filtering new block on tags 28983 1726883060.60146: done filtering new block on tags 28983 1726883060.60148: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28983 1726883060.60152: extending task lists for all hosts with included blocks 28983 1726883060.60346: done 
extending task lists 28983 1726883060.60347: done processing included files 28983 1726883060.60348: results queue empty 28983 1726883060.60349: checking for any_errors_fatal 28983 1726883060.60351: done checking for any_errors_fatal 28983 1726883060.60352: checking for max_fail_percentage 28983 1726883060.60353: done checking for max_fail_percentage 28983 1726883060.60353: checking to see if all hosts have failed and the running result is not ok 28983 1726883060.60354: done checking to see if all hosts have failed 28983 1726883060.60354: getting the remaining hosts for this loop 28983 1726883060.60355: done getting the remaining hosts for this loop 28983 1726883060.60357: getting the next task for host managed_node2 28983 1726883060.60361: done getting next task for host managed_node2 28983 1726883060.60362: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883060.60364: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726883060.60366: getting variables 28983 1726883060.60367: in VariableManager get_vars() 28983 1726883060.60376: Calling all_inventory to load vars for managed_node2 28983 1726883060.60378: Calling groups_inventory to load vars for managed_node2 28983 1726883060.60380: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883060.60384: Calling all_plugins_play to load vars for managed_node2 28983 1726883060.60386: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883060.60388: Calling groups_plugins_play to load vars for managed_node2 28983 1726883060.61524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883060.63120: done with get_vars() 28983 1726883060.63144: done getting variables 28983 1726883060.63179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:44:20 -0400 (0:00:00.130) 0:01:30.629 ****** 28983 1726883060.63201: entering _queue_task() for managed_node2/set_fact 28983 1726883060.63494: worker is 1 (out of 1 available) 28983 1726883060.63506: exiting _queue_task() for managed_node2/set_fact 28983 1726883060.63519: done queuing things up, now waiting for results queue to drain 28983 1726883060.63521: waiting for pending results... 
28983 1726883060.63720: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883060.63831: in run() - task 0affe814-3a2d-b16d-c0a7-000000001665 28983 1726883060.63845: variable 'ansible_search_path' from source: unknown 28983 1726883060.63850: variable 'ansible_search_path' from source: unknown 28983 1726883060.63887: calling self._execute() 28983 1726883060.63970: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.63978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.63992: variable 'omit' from source: magic vars 28983 1726883060.64324: variable 'ansible_distribution_major_version' from source: facts 28983 1726883060.64337: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883060.64344: variable 'omit' from source: magic vars 28983 1726883060.64389: variable 'omit' from source: magic vars 28983 1726883060.64424: variable 'omit' from source: magic vars 28983 1726883060.64466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883060.64504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883060.64524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883060.64547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883060.64557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883060.64589: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883060.64593: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.64596: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883060.64686: Set connection var ansible_connection to ssh 28983 1726883060.64696: Set connection var ansible_shell_executable to /bin/sh 28983 1726883060.64706: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883060.64714: Set connection var ansible_timeout to 10 28983 1726883060.64720: Set connection var ansible_pipelining to False 28983 1726883060.64723: Set connection var ansible_shell_type to sh 28983 1726883060.64748: variable 'ansible_shell_executable' from source: unknown 28983 1726883060.64752: variable 'ansible_connection' from source: unknown 28983 1726883060.64755: variable 'ansible_module_compression' from source: unknown 28983 1726883060.64759: variable 'ansible_shell_type' from source: unknown 28983 1726883060.64762: variable 'ansible_shell_executable' from source: unknown 28983 1726883060.64764: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.64774: variable 'ansible_pipelining' from source: unknown 28983 1726883060.64777: variable 'ansible_timeout' from source: unknown 28983 1726883060.64779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.64898: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883060.64908: variable 'omit' from source: magic vars 28983 1726883060.64915: starting attempt loop 28983 1726883060.64917: running the handler 28983 1726883060.64930: handler run complete 28983 1726883060.64942: attempt loop complete, returning result 28983 1726883060.64946: _execute() done 28983 1726883060.64949: dumping result to json 28983 1726883060.64952: done dumping result, returning 28983 1726883060.64960: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-b16d-c0a7-000000001665] 28983 1726883060.64966: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001665 28983 1726883060.65059: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001665 28983 1726883060.65064: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28983 1726883060.65135: no more pending results, returning what we have 28983 1726883060.65139: results queue empty 28983 1726883060.65140: checking for any_errors_fatal 28983 1726883060.65142: done checking for any_errors_fatal 28983 1726883060.65143: checking for max_fail_percentage 28983 1726883060.65145: done checking for max_fail_percentage 28983 1726883060.65146: checking to see if all hosts have failed and the running result is not ok 28983 1726883060.65147: done checking to see if all hosts have failed 28983 1726883060.65147: getting the remaining hosts for this loop 28983 1726883060.65150: done getting the remaining hosts for this loop 28983 1726883060.65155: getting the next task for host managed_node2 28983 1726883060.65163: done getting next task for host managed_node2 28983 1726883060.65165: ^ task is: TASK: Stat profile file 28983 1726883060.65171: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883060.65176: getting variables 28983 1726883060.65178: in VariableManager get_vars() 28983 1726883060.65226: Calling all_inventory to load vars for managed_node2 28983 1726883060.65230: Calling groups_inventory to load vars for managed_node2 28983 1726883060.65233: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883060.65245: Calling all_plugins_play to load vars for managed_node2 28983 1726883060.65248: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883060.65252: Calling groups_plugins_play to load vars for managed_node2 28983 1726883060.66494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883060.68109: done with get_vars() 28983 1726883060.68130: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:44:20 -0400 (0:00:00.050) 0:01:30.679 ****** 28983 1726883060.68208: entering _queue_task() for managed_node2/stat 28983 1726883060.68453: worker is 1 (out of 1 available) 28983 1726883060.68467: exiting _queue_task() for managed_node2/stat 28983 1726883060.68480: done queuing things up, now waiting for results queue to drain 28983 1726883060.68483: 
waiting for pending results... 28983 1726883060.68673: running TaskExecutor() for managed_node2/TASK: Stat profile file 28983 1726883060.68786: in run() - task 0affe814-3a2d-b16d-c0a7-000000001666 28983 1726883060.68799: variable 'ansible_search_path' from source: unknown 28983 1726883060.68803: variable 'ansible_search_path' from source: unknown 28983 1726883060.68839: calling self._execute() 28983 1726883060.68923: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.68927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.68940: variable 'omit' from source: magic vars 28983 1726883060.69264: variable 'ansible_distribution_major_version' from source: facts 28983 1726883060.69274: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883060.69284: variable 'omit' from source: magic vars 28983 1726883060.69327: variable 'omit' from source: magic vars 28983 1726883060.69413: variable 'profile' from source: play vars 28983 1726883060.69418: variable 'interface' from source: play vars 28983 1726883060.69474: variable 'interface' from source: play vars 28983 1726883060.69497: variable 'omit' from source: magic vars 28983 1726883060.69535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883060.69567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883060.69588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883060.69608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883060.69617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883060.69647: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883060.69650: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.69656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.69741: Set connection var ansible_connection to ssh 28983 1726883060.69752: Set connection var ansible_shell_executable to /bin/sh 28983 1726883060.69761: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883060.69770: Set connection var ansible_timeout to 10 28983 1726883060.69778: Set connection var ansible_pipelining to False 28983 1726883060.69781: Set connection var ansible_shell_type to sh 28983 1726883060.69800: variable 'ansible_shell_executable' from source: unknown 28983 1726883060.69803: variable 'ansible_connection' from source: unknown 28983 1726883060.69806: variable 'ansible_module_compression' from source: unknown 28983 1726883060.69810: variable 'ansible_shell_type' from source: unknown 28983 1726883060.69813: variable 'ansible_shell_executable' from source: unknown 28983 1726883060.69824: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883060.69827: variable 'ansible_pipelining' from source: unknown 28983 1726883060.69831: variable 'ansible_timeout' from source: unknown 28983 1726883060.69833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883060.70003: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883060.70014: variable 'omit' from source: magic vars 28983 1726883060.70020: starting attempt loop 28983 1726883060.70023: running the handler 28983 1726883060.70039: _low_level_execute_command(): starting 28983 1726883060.70048: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 
1726883060.70599: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.70603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.70606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.70608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.70666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.70669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.70676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.70757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.72519: stdout chunk (state=3): >>>/root <<< 28983 1726883060.72625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.72682: stderr chunk (state=3): >>><<< 28983 1726883060.72685: stdout chunk (state=3): >>><<< 28983 1726883060.72709: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.72720: _low_level_execute_command(): starting 28983 1726883060.72726: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804 `" && echo ansible-tmp-1726883060.7270892-32379-93326001901804="` echo /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804 `" ) && sleep 0' 28983 1726883060.73190: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.73194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883060.73204: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883060.73208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.73211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.73259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.73262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.73346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.75367: stdout chunk (state=3): >>>ansible-tmp-1726883060.7270892-32379-93326001901804=/root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804 <<< 28983 1726883060.75485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.75525: stderr chunk (state=3): >>><<< 28983 1726883060.75529: stdout chunk (state=3): >>><<< 28983 1726883060.75545: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883060.7270892-32379-93326001901804=/root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.75583: variable 'ansible_module_compression' from source: unknown 28983 1726883060.75640: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883060.75671: variable 'ansible_facts' from source: unknown 28983 1726883060.75744: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py 28983 1726883060.75857: Sending initial data 28983 1726883060.75860: Sent initial data (152 bytes) 28983 1726883060.76294: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.76298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883060.76326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.76329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.76331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.76388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.76396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.76482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.78122: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883060.78130: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883060.78190: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883060.78256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp4bmx4dy7 /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py <<< 28983 1726883060.78267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py" <<< 28983 1726883060.78329: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp4bmx4dy7" to remote "/root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py" <<< 28983 1726883060.79240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.79298: stderr chunk (state=3): >>><<< 28983 1726883060.79302: stdout chunk (state=3): >>><<< 28983 1726883060.79319: done transferring module to remote 28983 1726883060.79328: _low_level_execute_command(): starting 28983 1726883060.79335: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/ /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py && sleep 0' 28983 1726883060.79783: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883060.79786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883060.79789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883060.79791: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883060.79797: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.79857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883060.79861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883060.79921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.81744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883060.81791: stderr chunk (state=3): >>><<< 28983 1726883060.81795: stdout chunk (state=3): >>><<< 28983 1726883060.81809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883060.81812: _low_level_execute_command(): starting 28983 1726883060.81818: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/AnsiballZ_stat.py && sleep 0' 28983 1726883060.82262: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.82265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.82267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883060.82269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883060.82330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883060.82334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 
1726883060.82400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883060.99476: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883061.00994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883061.00998: stdout chunk (state=3): >>><<< 28983 1726883061.01001: stderr chunk (state=3): >>><<< 28983 1726883061.01140: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883061.01145: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883061.01148: _low_level_execute_command(): starting 28983 1726883061.01151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883060.7270892-32379-93326001901804/ > /dev/null 2>&1 && sleep 0' 28983 1726883061.01804: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883061.01941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883061.01954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883061.01983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.02099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.04133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883061.04192: stdout chunk (state=3): >>><<< 28983 1726883061.04196: stderr chunk (state=3): >>><<< 28983 1726883061.04236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883061.04278: handler run complete 28983 1726883061.04341: attempt loop complete, returning result 28983 1726883061.04356: _execute() done 28983 1726883061.04359: dumping result to json 28983 1726883061.04361: done dumping result, returning 28983 1726883061.04438: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affe814-3a2d-b16d-c0a7-000000001666] 28983 1726883061.04442: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001666 28983 1726883061.04527: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001666 28983 1726883061.04532: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883061.04615: no more pending results, returning what we have 28983 1726883061.04620: results queue empty 28983 1726883061.04621: checking for any_errors_fatal 28983 1726883061.04633: done checking for any_errors_fatal 28983 1726883061.04636: checking for max_fail_percentage 28983 1726883061.04639: done checking for max_fail_percentage 28983 1726883061.04641: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.04642: done checking to see if all hosts have failed 28983 1726883061.04643: getting the remaining hosts for this loop 28983 1726883061.04647: done getting the remaining hosts for this loop 28983 1726883061.04654: getting the next task for host managed_node2 28983 1726883061.04664: done getting next task for host managed_node2 28983 1726883061.04668: ^ task is: TASK: Set NM profile exist flag based on the profile files 28983 1726883061.04675: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883061.04680: getting variables 28983 1726883061.04682: in VariableManager get_vars() 28983 1726883061.04729: Calling all_inventory to load vars for managed_node2 28983 1726883061.04733: Calling groups_inventory to load vars for managed_node2 28983 1726883061.04812: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.04833: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.04850: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.04855: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.06491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.09249: done with get_vars() 28983 1726883061.09295: done getting variables 28983 1726883061.09366: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:44:21 -0400 (0:00:00.412) 0:01:31.091 ****** 28983 1726883061.09417: entering _queue_task() for managed_node2/set_fact 28983 1726883061.09865: worker is 1 (out of 1 available) 28983 1726883061.09878: exiting _queue_task() for managed_node2/set_fact 28983 1726883061.09889: done queuing things up, now waiting for results queue to drain 28983 1726883061.09890: waiting for pending results... 28983 1726883061.10261: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28983 1726883061.10312: in run() - task 0affe814-3a2d-b16d-c0a7-000000001667 28983 1726883061.10332: variable 'ansible_search_path' from source: unknown 28983 1726883061.10345: variable 'ansible_search_path' from source: unknown 28983 1726883061.10402: calling self._execute() 28983 1726883061.10526: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.10540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.10558: variable 'omit' from source: magic vars 28983 1726883061.11072: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.11129: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.11264: variable 'profile_stat' from source: set_fact 28983 1726883061.11281: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883061.11289: when evaluation is False, skipping this task 28983 1726883061.11296: _execute() done 28983 1726883061.11302: dumping result to json 28983 1726883061.11309: done dumping 
result, returning 28983 1726883061.11341: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-b16d-c0a7-000000001667] 28983 1726883061.11349: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001667 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883061.11627: no more pending results, returning what we have 28983 1726883061.11632: results queue empty 28983 1726883061.11635: checking for any_errors_fatal 28983 1726883061.11646: done checking for any_errors_fatal 28983 1726883061.11647: checking for max_fail_percentage 28983 1726883061.11649: done checking for max_fail_percentage 28983 1726883061.11650: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.11651: done checking to see if all hosts have failed 28983 1726883061.11652: getting the remaining hosts for this loop 28983 1726883061.11654: done getting the remaining hosts for this loop 28983 1726883061.11658: getting the next task for host managed_node2 28983 1726883061.11666: done getting next task for host managed_node2 28983 1726883061.11668: ^ task is: TASK: Get NM profile info 28983 1726883061.11677: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883061.11684: getting variables 28983 1726883061.11685: in VariableManager get_vars() 28983 1726883061.11726: Calling all_inventory to load vars for managed_node2 28983 1726883061.11729: Calling groups_inventory to load vars for managed_node2 28983 1726883061.11733: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.11859: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001667 28983 1726883061.11862: WORKER PROCESS EXITING 28983 1726883061.11871: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.11875: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.11879: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.15001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.18163: done with get_vars() 28983 1726883061.18195: done getting variables 28983 1726883061.18268: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:44:21 -0400 (0:00:00.088) 0:01:31.180 ****** 28983 1726883061.18319: entering _queue_task() for managed_node2/shell 28983 1726883061.18702: worker is 1 (out of 1 available) 28983 1726883061.18718: exiting _queue_task() for managed_node2/shell 28983 1726883061.18852: done queuing things up, now waiting for results queue to drain 28983 1726883061.18854: waiting for pending results... 28983 1726883061.19078: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28983 1726883061.19212: in run() - task 0affe814-3a2d-b16d-c0a7-000000001668 28983 1726883061.19237: variable 'ansible_search_path' from source: unknown 28983 1726883061.19242: variable 'ansible_search_path' from source: unknown 28983 1726883061.19281: calling self._execute() 28983 1726883061.19374: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.19379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.19392: variable 'omit' from source: magic vars 28983 1726883061.19943: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.19947: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.19950: variable 'omit' from source: magic vars 28983 1726883061.19953: variable 'omit' from source: magic vars 28983 1726883061.20106: variable 'profile' from source: play vars 28983 1726883061.20118: variable 'interface' from source: play vars 28983 1726883061.20216: variable 'interface' from source: play vars 28983 1726883061.20247: variable 'omit' from source: magic vars 28983 1726883061.20341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883061.20540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 28983 1726883061.20544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883061.20548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883061.20550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883061.20553: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883061.20556: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.20558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.20628: Set connection var ansible_connection to ssh 28983 1726883061.20645: Set connection var ansible_shell_executable to /bin/sh 28983 1726883061.20656: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883061.20668: Set connection var ansible_timeout to 10 28983 1726883061.20677: Set connection var ansible_pipelining to False 28983 1726883061.20680: Set connection var ansible_shell_type to sh 28983 1726883061.20712: variable 'ansible_shell_executable' from source: unknown 28983 1726883061.20716: variable 'ansible_connection' from source: unknown 28983 1726883061.20728: variable 'ansible_module_compression' from source: unknown 28983 1726883061.20733: variable 'ansible_shell_type' from source: unknown 28983 1726883061.20742: variable 'ansible_shell_executable' from source: unknown 28983 1726883061.20745: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.20748: variable 'ansible_pipelining' from source: unknown 28983 1726883061.20750: variable 'ansible_timeout' from source: unknown 28983 1726883061.20757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.21158: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883061.21162: variable 'omit' from source: magic vars 28983 1726883061.21166: starting attempt loop 28983 1726883061.21169: running the handler 28983 1726883061.21175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883061.21179: _low_level_execute_command(): starting 28983 1726883061.21183: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883061.21843: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883061.21894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 28983 1726883061.21910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.21991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.23763: stdout chunk (state=3): >>>/root <<< 28983 1726883061.23915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883061.23927: stderr chunk (state=3): >>><<< 28983 1726883061.23930: stdout chunk (state=3): >>><<< 28983 1726883061.23955: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883061.23966: _low_level_execute_command(): starting 28983 1726883061.23987: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375 `" && echo ansible-tmp-1726883061.2395525-32400-23344561761375="` echo /root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375 `" ) && sleep 0' 28983 1726883061.25447: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883061.25457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883061.25460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883061.25462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.25564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.27531: stdout chunk (state=3): >>>ansible-tmp-1726883061.2395525-32400-23344561761375=/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375 <<< 28983 1726883061.27640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883061.27719: stderr chunk (state=3): >>><<< 28983 1726883061.27744: stdout chunk (state=3): >>><<< 28983 
1726883061.27774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883061.2395525-32400-23344561761375=/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883061.27808: variable 'ansible_module_compression' from source: unknown 28983 1726883061.27878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883061.27921: variable 'ansible_facts' from source: unknown 28983 1726883061.28027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py 28983 1726883061.28197: Sending initial data 28983 1726883061.28200: Sent initial data (155 bytes) 28983 1726883061.28926: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883061.28939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.29045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.30714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883061.30791: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883061.30868: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp580nw14g /root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py <<< 28983 1726883061.30872: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py" <<< 28983 1726883061.30944: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp580nw14g" to remote "/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py" <<< 28983 1726883061.32152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883061.32254: stderr chunk (state=3): >>><<< 28983 1726883061.32259: stdout chunk (state=3): >>><<< 28983 1726883061.32265: done transferring module to remote 28983 1726883061.32282: _low_level_execute_command(): starting 28983 1726883061.32293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/ /root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py && sleep 0' 28983 1726883061.32959: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883061.33086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883061.33108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883061.33157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.33232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.35214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883061.35218: stdout chunk (state=3): >>><<< 28983 1726883061.35220: stderr chunk (state=3): >>><<< 28983 1726883061.35239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883061.35330: _low_level_execute_command(): starting 28983 1726883061.35336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/AnsiballZ_command.py && sleep 0' 28983 1726883061.35840: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883061.35856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883061.35869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883061.35893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883061.35911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883061.35925: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883061.36031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883061.36049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883061.36079: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.36185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.55346: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:44:21.533518", "end": "2024-09-20 21:44:21.552334", "delta": "0:00:00.018816", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883061.57023: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. <<< 28983 1726883061.57399: stderr chunk (state=3): >>><<< 28983 1726883061.57403: stdout chunk (state=3): >>><<< 28983 1726883061.57407: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:44:21.533518", "end": "2024-09-20 21:44:21.552334", "delta": "0:00:00.018816", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 28983 1726883061.57410: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883061.57413: _low_level_execute_command(): starting 28983 1726883061.57424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726883061.2395525-32400-23344561761375/ > /dev/null 2>&1 && sleep 0' 28983 1726883061.58209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883061.58217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883061.58220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883061.58223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883061.58225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883061.58291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883061.58318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883061.58418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883061.60389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883061.60407: stderr chunk (state=3): >>><<< 28983 1726883061.60413: stdout chunk (state=3): >>><<< 28983 1726883061.60427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883061.60541: handler run complete 28983 1726883061.60544: Evaluated conditional (False): False 28983 1726883061.60546: attempt loop complete, returning result 28983 1726883061.60548: _execute() done 28983 1726883061.60551: dumping result to json 28983 1726883061.60553: done dumping result, returning 28983 1726883061.60554: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affe814-3a2d-b16d-c0a7-000000001668] 28983 1726883061.60556: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001668 28983 1726883061.60670: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001668 28983 1726883061.60677: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018816", "end": "2024-09-20 21:44:21.552334", "rc": 1, "start": "2024-09-20 21:44:21.533518" } MSG: non-zero return code ...ignoring 28983 1726883061.60776: no more pending results, returning what we have 28983 1726883061.60780: results queue empty 28983 1726883061.60781: checking for any_errors_fatal 28983 1726883061.60794: done checking for any_errors_fatal 28983 1726883061.60795: checking for max_fail_percentage 28983 1726883061.60797: done checking for max_fail_percentage 28983 1726883061.60799: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.60800: done checking to see if all hosts have failed 28983 1726883061.60801: getting the remaining hosts for this loop 28983 1726883061.60804: done getting the remaining hosts for this loop 28983 1726883061.60810: getting the next task for host managed_node2 28983 1726883061.60820: done getting next task for host managed_node2 28983 1726883061.60823: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883061.60830: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883061.60840: getting variables 28983 1726883061.60842: in VariableManager get_vars() 28983 1726883061.60895: Calling all_inventory to load vars for managed_node2 28983 1726883061.60898: Calling groups_inventory to load vars for managed_node2 28983 1726883061.60903: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.60915: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.60920: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.60923: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.64037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.67027: done with get_vars() 28983 1726883061.67066: done getting variables 28983 1726883061.67142: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:44:21 -0400 (0:00:00.488) 0:01:31.669 ****** 28983 1726883061.67186: entering _queue_task() for managed_node2/set_fact 28983 1726883061.67569: worker is 1 (out of 1 available) 28983 1726883061.67587: exiting _queue_task() for managed_node2/set_fact 28983 
1726883061.67602: done queuing things up, now waiting for results queue to drain 28983 1726883061.67604: waiting for pending results... 28983 1726883061.68075: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883061.68191: in run() - task 0affe814-3a2d-b16d-c0a7-000000001669 28983 1726883061.68196: variable 'ansible_search_path' from source: unknown 28983 1726883061.68199: variable 'ansible_search_path' from source: unknown 28983 1726883061.68203: calling self._execute() 28983 1726883061.68337: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.68349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.68362: variable 'omit' from source: magic vars 28983 1726883061.68912: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.68917: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.69060: variable 'nm_profile_exists' from source: set_fact 28983 1726883061.69079: Evaluated conditional (nm_profile_exists.rc == 0): False 28983 1726883061.69083: when evaluation is False, skipping this task 28983 1726883061.69086: _execute() done 28983 1726883061.69089: dumping result to json 28983 1726883061.69091: done dumping result, returning 28983 1726883061.69114: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-b16d-c0a7-000000001669] 28983 1726883061.69117: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001669 28983 1726883061.69228: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001669 28983 1726883061.69230: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28983 1726883061.69311: no more pending results, 
returning what we have 28983 1726883061.69316: results queue empty 28983 1726883061.69317: checking for any_errors_fatal 28983 1726883061.69328: done checking for any_errors_fatal 28983 1726883061.69329: checking for max_fail_percentage 28983 1726883061.69331: done checking for max_fail_percentage 28983 1726883061.69332: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.69333: done checking to see if all hosts have failed 28983 1726883061.69335: getting the remaining hosts for this loop 28983 1726883061.69337: done getting the remaining hosts for this loop 28983 1726883061.69342: getting the next task for host managed_node2 28983 1726883061.69353: done getting next task for host managed_node2 28983 1726883061.69356: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883061.69362: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883061.69366: getting variables 28983 1726883061.69368: in VariableManager get_vars() 28983 1726883061.69407: Calling all_inventory to load vars for managed_node2 28983 1726883061.69410: Calling groups_inventory to load vars for managed_node2 28983 1726883061.69414: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.69423: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.69427: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.69430: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.71699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.73310: done with get_vars() 28983 1726883061.73332: done getting variables 28983 1726883061.73382: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883061.73481: variable 'profile' from source: play vars 28983 1726883061.73484: variable 'interface' from source: play vars 28983 1726883061.73537: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:44:21 -0400 (0:00:00.063) 0:01:31.733 ****** 28983 1726883061.73564: entering _queue_task() for managed_node2/command 28983 1726883061.73823: worker is 1 (out of 1 available) 28983 1726883061.73840: exiting _queue_task() for managed_node2/command 28983 1726883061.73853: done queuing things up, now waiting for results queue to drain 28983 1726883061.73855: waiting for pending results... 
28983 1726883061.74131: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 28983 1726883061.74268: in run() - task 0affe814-3a2d-b16d-c0a7-00000000166b 28983 1726883061.74275: variable 'ansible_search_path' from source: unknown 28983 1726883061.74279: variable 'ansible_search_path' from source: unknown 28983 1726883061.74323: calling self._execute() 28983 1726883061.74454: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.74458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.74461: variable 'omit' from source: magic vars 28983 1726883061.74854: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.74867: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.75014: variable 'profile_stat' from source: set_fact 28983 1726883061.75027: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883061.75031: when evaluation is False, skipping this task 28983 1726883061.75035: _execute() done 28983 1726883061.75040: dumping result to json 28983 1726883061.75046: done dumping result, returning 28983 1726883061.75054: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000166b] 28983 1726883061.75107: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166b 28983 1726883061.75178: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166b 28983 1726883061.75181: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883061.75241: no more pending results, returning what we have 28983 1726883061.75244: results queue empty 28983 1726883061.75245: checking for any_errors_fatal 28983 1726883061.75251: done checking for any_errors_fatal 28983 1726883061.75252: 
checking for max_fail_percentage 28983 1726883061.75254: done checking for max_fail_percentage 28983 1726883061.75255: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.75256: done checking to see if all hosts have failed 28983 1726883061.75257: getting the remaining hosts for this loop 28983 1726883061.75258: done getting the remaining hosts for this loop 28983 1726883061.75262: getting the next task for host managed_node2 28983 1726883061.75270: done getting next task for host managed_node2 28983 1726883061.75275: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883061.75282: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883061.75285: getting variables 28983 1726883061.75287: in VariableManager get_vars() 28983 1726883061.75322: Calling all_inventory to load vars for managed_node2 28983 1726883061.75324: Calling groups_inventory to load vars for managed_node2 28983 1726883061.75327: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.75381: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.75386: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.75391: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.76836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.78449: done with get_vars() 28983 1726883061.78471: done getting variables 28983 1726883061.78523: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883061.78614: variable 'profile' from source: play vars 28983 1726883061.78617: variable 'interface' from source: play vars 28983 1726883061.78664: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:44:21 -0400 (0:00:00.051) 0:01:31.784 ****** 28983 1726883061.78691: entering _queue_task() for managed_node2/set_fact 28983 1726883061.78959: worker is 1 (out of 1 available) 28983 1726883061.78973: exiting _queue_task() for managed_node2/set_fact 28983 1726883061.78987: done queuing things up, now waiting for results queue to drain 28983 1726883061.78989: waiting for pending results... 
28983 1726883061.79547: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 28983 1726883061.79552: in run() - task 0affe814-3a2d-b16d-c0a7-00000000166c 28983 1726883061.79555: variable 'ansible_search_path' from source: unknown 28983 1726883061.79558: variable 'ansible_search_path' from source: unknown 28983 1726883061.79561: calling self._execute() 28983 1726883061.79702: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.79706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.79710: variable 'omit' from source: magic vars 28983 1726883061.80147: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.80160: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.80327: variable 'profile_stat' from source: set_fact 28983 1726883061.80355: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883061.80359: when evaluation is False, skipping this task 28983 1726883061.80362: _execute() done 28983 1726883061.80367: dumping result to json 28983 1726883061.80374: done dumping result, returning 28983 1726883061.80380: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000166c] 28983 1726883061.80387: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166c 28983 1726883061.80496: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166c 28983 1726883061.80500: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883061.80642: no more pending results, returning what we have 28983 1726883061.80647: results queue empty 28983 1726883061.80648: checking for any_errors_fatal 28983 1726883061.80654: done checking for any_errors_fatal 28983 1726883061.80655: 
checking for max_fail_percentage 28983 1726883061.80657: done checking for max_fail_percentage 28983 1726883061.80658: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.80659: done checking to see if all hosts have failed 28983 1726883061.80660: getting the remaining hosts for this loop 28983 1726883061.80662: done getting the remaining hosts for this loop 28983 1726883061.80666: getting the next task for host managed_node2 28983 1726883061.80677: done getting next task for host managed_node2 28983 1726883061.80680: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28983 1726883061.80686: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883061.80690: getting variables 28983 1726883061.80691: in VariableManager get_vars() 28983 1726883061.80728: Calling all_inventory to load vars for managed_node2 28983 1726883061.80732: Calling groups_inventory to load vars for managed_node2 28983 1726883061.80774: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.80786: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.80790: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.80794: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.82046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.83868: done with get_vars() 28983 1726883061.83906: done getting variables 28983 1726883061.83982: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883061.84109: variable 'profile' from source: play vars 28983 1726883061.84113: variable 'interface' from source: play vars 28983 1726883061.84187: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:44:21 -0400 (0:00:00.055) 0:01:31.840 ****** 28983 1726883061.84224: entering _queue_task() for managed_node2/command 28983 1726883061.84558: worker is 1 (out of 1 available) 28983 1726883061.84573: exiting _queue_task() for managed_node2/command 28983 1726883061.84586: done queuing things up, now waiting for results queue to drain 28983 1726883061.84588: waiting for pending results... 
28983 1726883061.84846: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 28983 1726883061.84965: in run() - task 0affe814-3a2d-b16d-c0a7-00000000166d 28983 1726883061.85006: variable 'ansible_search_path' from source: unknown 28983 1726883061.85025: variable 'ansible_search_path' from source: unknown 28983 1726883061.85138: calling self._execute() 28983 1726883061.85196: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.85209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.85229: variable 'omit' from source: magic vars 28983 1726883061.85747: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.85767: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.85959: variable 'profile_stat' from source: set_fact 28983 1726883061.85982: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883061.85993: when evaluation is False, skipping this task 28983 1726883061.86016: _execute() done 28983 1726883061.86053: dumping result to json 28983 1726883061.86057: done dumping result, returning 28983 1726883061.86060: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000166d] 28983 1726883061.86062: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166d skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883061.86228: no more pending results, returning what we have 28983 1726883061.86233: results queue empty 28983 1726883061.86238: checking for any_errors_fatal 28983 1726883061.86249: done checking for any_errors_fatal 28983 1726883061.86250: checking for max_fail_percentage 28983 1726883061.86252: done checking for max_fail_percentage 28983 1726883061.86253: checking to see if all hosts have 
failed and the running result is not ok 28983 1726883061.86254: done checking to see if all hosts have failed 28983 1726883061.86255: getting the remaining hosts for this loop 28983 1726883061.86257: done getting the remaining hosts for this loop 28983 1726883061.86263: getting the next task for host managed_node2 28983 1726883061.86271: done getting next task for host managed_node2 28983 1726883061.86276: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28983 1726883061.86281: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883061.86286: getting variables 28983 1726883061.86287: in VariableManager get_vars() 28983 1726883061.86324: Calling all_inventory to load vars for managed_node2 28983 1726883061.86327: Calling groups_inventory to load vars for managed_node2 28983 1726883061.86339: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.86349: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166d 28983 1726883061.86353: WORKER PROCESS EXITING 28983 1726883061.86362: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.86366: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.86369: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.87756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.89713: done with get_vars() 28983 1726883061.89749: done getting variables 28983 1726883061.89824: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883061.89979: variable 'profile' from source: play vars 28983 1726883061.89983: variable 'interface' from source: play vars 28983 1726883061.90075: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:44:21 -0400 (0:00:00.058) 0:01:31.898 ****** 28983 1726883061.90102: entering _queue_task() for managed_node2/set_fact 28983 1726883061.90354: worker is 1 (out of 1 available) 28983 1726883061.90369: exiting _queue_task() for managed_node2/set_fact 28983 
1726883061.90384: done queuing things up, now waiting for results queue to drain 28983 1726883061.90387: waiting for pending results... 28983 1726883061.90577: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 28983 1726883061.90683: in run() - task 0affe814-3a2d-b16d-c0a7-00000000166e 28983 1726883061.90695: variable 'ansible_search_path' from source: unknown 28983 1726883061.90699: variable 'ansible_search_path' from source: unknown 28983 1726883061.90739: calling self._execute() 28983 1726883061.90815: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.90821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.90834: variable 'omit' from source: magic vars 28983 1726883061.91144: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.91156: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.91259: variable 'profile_stat' from source: set_fact 28983 1726883061.91270: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883061.91278: when evaluation is False, skipping this task 28983 1726883061.91282: _execute() done 28983 1726883061.91285: dumping result to json 28983 1726883061.91287: done dumping result, returning 28983 1726883061.91299: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000166e] 28983 1726883061.91302: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166e 28983 1726883061.91399: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000166e 28983 1726883061.91402: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883061.91454: no more pending results, returning what we have 28983 1726883061.91458: results queue empty 28983 
1726883061.91459: checking for any_errors_fatal 28983 1726883061.91464: done checking for any_errors_fatal 28983 1726883061.91465: checking for max_fail_percentage 28983 1726883061.91467: done checking for max_fail_percentage 28983 1726883061.91468: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.91469: done checking to see if all hosts have failed 28983 1726883061.91470: getting the remaining hosts for this loop 28983 1726883061.91474: done getting the remaining hosts for this loop 28983 1726883061.91478: getting the next task for host managed_node2 28983 1726883061.91488: done getting next task for host managed_node2 28983 1726883061.91491: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28983 1726883061.91496: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883061.91500: getting variables 28983 1726883061.91501: in VariableManager get_vars() 28983 1726883061.91533: Calling all_inventory to load vars for managed_node2 28983 1726883061.91538: Calling groups_inventory to load vars for managed_node2 28983 1726883061.91541: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.91551: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.91553: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.91556: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.92904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.94511: done with get_vars() 28983 1726883061.94535: done getting variables 28983 1726883061.94585: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883061.94677: variable 'profile' from source: play vars 28983 1726883061.94681: variable 'interface' from source: play vars 28983 1726883061.94724: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:44:21 -0400 (0:00:00.046) 0:01:31.945 ****** 28983 1726883061.94751: entering _queue_task() for managed_node2/assert 28983 1726883061.94972: worker is 1 (out of 1 available) 28983 1726883061.94986: exiting _queue_task() for managed_node2/assert 28983 1726883061.95000: done queuing things up, now waiting for results queue to drain 28983 1726883061.95002: waiting for pending results... 
28983 1726883061.95202: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 28983 1726883061.95288: in run() - task 0affe814-3a2d-b16d-c0a7-0000000015d5 28983 1726883061.95302: variable 'ansible_search_path' from source: unknown 28983 1726883061.95305: variable 'ansible_search_path' from source: unknown 28983 1726883061.95340: calling self._execute() 28983 1726883061.95427: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.95436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.95448: variable 'omit' from source: magic vars 28983 1726883061.95765: variable 'ansible_distribution_major_version' from source: facts 28983 1726883061.95779: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883061.95795: variable 'omit' from source: magic vars 28983 1726883061.95832: variable 'omit' from source: magic vars 28983 1726883061.95917: variable 'profile' from source: play vars 28983 1726883061.95922: variable 'interface' from source: play vars 28983 1726883061.95975: variable 'interface' from source: play vars 28983 1726883061.95994: variable 'omit' from source: magic vars 28983 1726883061.96036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883061.96069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883061.96091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883061.96107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883061.96120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883061.96154: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883061.96157: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.96162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.96247: Set connection var ansible_connection to ssh 28983 1726883061.96257: Set connection var ansible_shell_executable to /bin/sh 28983 1726883061.96266: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883061.96277: Set connection var ansible_timeout to 10 28983 1726883061.96283: Set connection var ansible_pipelining to False 28983 1726883061.96287: Set connection var ansible_shell_type to sh 28983 1726883061.96306: variable 'ansible_shell_executable' from source: unknown 28983 1726883061.96309: variable 'ansible_connection' from source: unknown 28983 1726883061.96312: variable 'ansible_module_compression' from source: unknown 28983 1726883061.96315: variable 'ansible_shell_type' from source: unknown 28983 1726883061.96320: variable 'ansible_shell_executable' from source: unknown 28983 1726883061.96323: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883061.96330: variable 'ansible_pipelining' from source: unknown 28983 1726883061.96333: variable 'ansible_timeout' from source: unknown 28983 1726883061.96347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883061.96458: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883061.96470: variable 'omit' from source: magic vars 28983 1726883061.96478: starting attempt loop 28983 1726883061.96481: running the handler 28983 1726883061.96590: variable 'lsr_net_profile_exists' from source: set_fact 28983 1726883061.96596: Evaluated conditional (not 
lsr_net_profile_exists): True 28983 1726883061.96602: handler run complete 28983 1726883061.96616: attempt loop complete, returning result 28983 1726883061.96619: _execute() done 28983 1726883061.96622: dumping result to json 28983 1726883061.96627: done dumping result, returning 28983 1726883061.96637: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-0000000015d5] 28983 1726883061.96642: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015d5 28983 1726883061.96726: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000015d5 28983 1726883061.96729: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883061.96790: no more pending results, returning what we have 28983 1726883061.96793: results queue empty 28983 1726883061.96794: checking for any_errors_fatal 28983 1726883061.96801: done checking for any_errors_fatal 28983 1726883061.96802: checking for max_fail_percentage 28983 1726883061.96804: done checking for max_fail_percentage 28983 1726883061.96805: checking to see if all hosts have failed and the running result is not ok 28983 1726883061.96806: done checking to see if all hosts have failed 28983 1726883061.96807: getting the remaining hosts for this loop 28983 1726883061.96809: done getting the remaining hosts for this loop 28983 1726883061.96814: getting the next task for host managed_node2 28983 1726883061.96822: done getting next task for host managed_node2 28983 1726883061.96826: ^ task is: TASK: Conditional asserts 28983 1726883061.96829: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883061.96833: getting variables 28983 1726883061.96837: in VariableManager get_vars() 28983 1726883061.96872: Calling all_inventory to load vars for managed_node2 28983 1726883061.96875: Calling groups_inventory to load vars for managed_node2 28983 1726883061.96879: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883061.96888: Calling all_plugins_play to load vars for managed_node2 28983 1726883061.96891: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883061.96895: Calling groups_plugins_play to load vars for managed_node2 28983 1726883061.98140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883061.99753: done with get_vars() 28983 1726883061.99781: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:44:21 -0400 (0:00:00.051) 0:01:31.996 ****** 28983 1726883061.99858: entering _queue_task() for managed_node2/include_tasks 28983 1726883062.00075: worker is 1 (out of 1 available) 28983 1726883062.00092: exiting _queue_task() for managed_node2/include_tasks 28983 1726883062.00103: done queuing things up, now waiting for results queue to drain 28983 1726883062.00105: waiting for pending results... 
28983 1726883062.00295: running TaskExecutor() for managed_node2/TASK: Conditional asserts 28983 1726883062.00380: in run() - task 0affe814-3a2d-b16d-c0a7-00000000100b 28983 1726883062.00392: variable 'ansible_search_path' from source: unknown 28983 1726883062.00396: variable 'ansible_search_path' from source: unknown 28983 1726883062.00635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883062.02701: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883062.02759: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883062.02791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883062.02821: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883062.02849: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883062.02917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883062.02942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883062.02967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883062.03002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 28983 1726883062.03014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883062.03139: dumping result to json 28983 1726883062.03143: done dumping result, returning 28983 1726883062.03150: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0affe814-3a2d-b16d-c0a7-00000000100b] 28983 1726883062.03158: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000100b 28983 1726883062.03261: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000100b 28983 1726883062.03265: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 28983 1726883062.03322: no more pending results, returning what we have 28983 1726883062.03326: results queue empty 28983 1726883062.03327: checking for any_errors_fatal 28983 1726883062.03333: done checking for any_errors_fatal 28983 1726883062.03336: checking for max_fail_percentage 28983 1726883062.03338: done checking for max_fail_percentage 28983 1726883062.03339: checking to see if all hosts have failed and the running result is not ok 28983 1726883062.03340: done checking to see if all hosts have failed 28983 1726883062.03341: getting the remaining hosts for this loop 28983 1726883062.03343: done getting the remaining hosts for this loop 28983 1726883062.03348: getting the next task for host managed_node2 28983 1726883062.03355: done getting next task for host managed_node2 28983 1726883062.03358: ^ task is: TASK: Success in test '{{ lsr_description }}' 28983 1726883062.03361: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883062.03365: getting variables 28983 1726883062.03367: in VariableManager get_vars() 28983 1726883062.03407: Calling all_inventory to load vars for managed_node2 28983 1726883062.03410: Calling groups_inventory to load vars for managed_node2 28983 1726883062.03414: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.03424: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.03427: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.03430: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.04830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883062.06421: done with get_vars() 28983 1726883062.06445: done getting variables 28983 1726883062.06495: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883062.06586: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:44:22 -0400 (0:00:00.067) 0:01:32.063 ****** 28983 1726883062.06614: entering _queue_task() for managed_node2/debug 28983 1726883062.06836: worker is 1 
(out of 1 available) 28983 1726883062.06852: exiting _queue_task() for managed_node2/debug 28983 1726883062.06865: done queuing things up, now waiting for results queue to drain 28983 1726883062.06866: waiting for pending results... 28983 1726883062.07062: running TaskExecutor() for managed_node2/TASK: Success in test 'I can remove an existing profile without taking it down' 28983 1726883062.07156: in run() - task 0affe814-3a2d-b16d-c0a7-00000000100c 28983 1726883062.07169: variable 'ansible_search_path' from source: unknown 28983 1726883062.07173: variable 'ansible_search_path' from source: unknown 28983 1726883062.07210: calling self._execute() 28983 1726883062.07292: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.07299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.07313: variable 'omit' from source: magic vars 28983 1726883062.07628: variable 'ansible_distribution_major_version' from source: facts 28983 1726883062.07640: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883062.07652: variable 'omit' from source: magic vars 28983 1726883062.07684: variable 'omit' from source: magic vars 28983 1726883062.07770: variable 'lsr_description' from source: include params 28983 1726883062.07789: variable 'omit' from source: magic vars 28983 1726883062.07825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883062.07860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883062.07887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883062.07901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883062.07912: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883062.07942: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883062.07945: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.07950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.08035: Set connection var ansible_connection to ssh 28983 1726883062.08046: Set connection var ansible_shell_executable to /bin/sh 28983 1726883062.08055: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883062.08063: Set connection var ansible_timeout to 10 28983 1726883062.08071: Set connection var ansible_pipelining to False 28983 1726883062.08073: Set connection var ansible_shell_type to sh 28983 1726883062.08098: variable 'ansible_shell_executable' from source: unknown 28983 1726883062.08102: variable 'ansible_connection' from source: unknown 28983 1726883062.08105: variable 'ansible_module_compression' from source: unknown 28983 1726883062.08107: variable 'ansible_shell_type' from source: unknown 28983 1726883062.08110: variable 'ansible_shell_executable' from source: unknown 28983 1726883062.08115: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.08117: variable 'ansible_pipelining' from source: unknown 28983 1726883062.08122: variable 'ansible_timeout' from source: unknown 28983 1726883062.08127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.08249: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883062.08259: variable 'omit' from source: magic vars 28983 1726883062.08264: starting attempt 
loop 28983 1726883062.08267: running the handler 28983 1726883062.08317: handler run complete 28983 1726883062.08334: attempt loop complete, returning result 28983 1726883062.08337: _execute() done 28983 1726883062.08341: dumping result to json 28983 1726883062.08343: done dumping result, returning 28983 1726883062.08351: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can remove an existing profile without taking it down' [0affe814-3a2d-b16d-c0a7-00000000100c] 28983 1726883062.08356: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000100c 28983 1726883062.08447: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000100c 28983 1726883062.08450: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 28983 1726883062.08502: no more pending results, returning what we have 28983 1726883062.08505: results queue empty 28983 1726883062.08506: checking for any_errors_fatal 28983 1726883062.08513: done checking for any_errors_fatal 28983 1726883062.08513: checking for max_fail_percentage 28983 1726883062.08516: done checking for max_fail_percentage 28983 1726883062.08517: checking to see if all hosts have failed and the running result is not ok 28983 1726883062.08518: done checking to see if all hosts have failed 28983 1726883062.08519: getting the remaining hosts for this loop 28983 1726883062.08521: done getting the remaining hosts for this loop 28983 1726883062.08525: getting the next task for host managed_node2 28983 1726883062.08532: done getting next task for host managed_node2 28983 1726883062.08546: ^ task is: TASK: Cleanup 28983 1726883062.08549: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883062.08554: getting variables 28983 1726883062.08555: in VariableManager get_vars() 28983 1726883062.08589: Calling all_inventory to load vars for managed_node2 28983 1726883062.08592: Calling groups_inventory to load vars for managed_node2 28983 1726883062.08596: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.08606: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.08609: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.08612: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.09963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883062.11567: done with get_vars() 28983 1726883062.11595: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:44:22 -0400 (0:00:00.050) 0:01:32.114 ****** 28983 1726883062.11665: entering _queue_task() for managed_node2/include_tasks 28983 1726883062.11878: worker is 1 (out of 1 available) 28983 1726883062.11892: exiting _queue_task() for managed_node2/include_tasks 28983 1726883062.11904: done queuing things up, now waiting for results queue to drain 28983 1726883062.11906: waiting for pending results... 
28983 1726883062.12098: running TaskExecutor() for managed_node2/TASK: Cleanup 28983 1726883062.12186: in run() - task 0affe814-3a2d-b16d-c0a7-000000001010 28983 1726883062.12200: variable 'ansible_search_path' from source: unknown 28983 1726883062.12203: variable 'ansible_search_path' from source: unknown 28983 1726883062.12247: variable 'lsr_cleanup' from source: include params 28983 1726883062.12409: variable 'lsr_cleanup' from source: include params 28983 1726883062.12468: variable 'omit' from source: magic vars 28983 1726883062.12578: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.12591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.12602: variable 'omit' from source: magic vars 28983 1726883062.12808: variable 'ansible_distribution_major_version' from source: facts 28983 1726883062.12815: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883062.12822: variable 'item' from source: unknown 28983 1726883062.12879: variable 'item' from source: unknown 28983 1726883062.12908: variable 'item' from source: unknown 28983 1726883062.12959: variable 'item' from source: unknown 28983 1726883062.13096: dumping result to json 28983 1726883062.13099: done dumping result, returning 28983 1726883062.13101: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affe814-3a2d-b16d-c0a7-000000001010] 28983 1726883062.13103: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001010 28983 1726883062.13182: no more pending results, returning what we have 28983 1726883062.13187: in VariableManager get_vars() 28983 1726883062.13223: Calling all_inventory to load vars for managed_node2 28983 1726883062.13226: Calling groups_inventory to load vars for managed_node2 28983 1726883062.13229: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.13241: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.13244: 
Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.13260: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.14288: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001010 28983 1726883062.14293: WORKER PROCESS EXITING 28983 1726883062.14485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883062.16077: done with get_vars() 28983 1726883062.16100: variable 'ansible_search_path' from source: unknown 28983 1726883062.16101: variable 'ansible_search_path' from source: unknown 28983 1726883062.16130: we have included files to process 28983 1726883062.16131: generating all_blocks data 28983 1726883062.16133: done generating all_blocks data 28983 1726883062.16140: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883062.16141: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883062.16143: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883062.16293: done processing included file 28983 1726883062.16295: iterating over new_blocks loaded from include file 28983 1726883062.16297: in VariableManager get_vars() 28983 1726883062.16311: done with get_vars() 28983 1726883062.16312: filtering new block on tags 28983 1726883062.16332: done filtering new block on tags 28983 1726883062.16335: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 28983 1726883062.16339: extending task lists for all hosts with included blocks 28983 
1726883062.17355: done extending task lists 28983 1726883062.17356: done processing included files 28983 1726883062.17357: results queue empty 28983 1726883062.17358: checking for any_errors_fatal 28983 1726883062.17361: done checking for any_errors_fatal 28983 1726883062.17361: checking for max_fail_percentage 28983 1726883062.17362: done checking for max_fail_percentage 28983 1726883062.17363: checking to see if all hosts have failed and the running result is not ok 28983 1726883062.17363: done checking to see if all hosts have failed 28983 1726883062.17364: getting the remaining hosts for this loop 28983 1726883062.17365: done getting the remaining hosts for this loop 28983 1726883062.17367: getting the next task for host managed_node2 28983 1726883062.17370: done getting next task for host managed_node2 28983 1726883062.17372: ^ task is: TASK: Cleanup profile and device 28983 1726883062.17375: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883062.17377: getting variables 28983 1726883062.17378: in VariableManager get_vars() 28983 1726883062.17388: Calling all_inventory to load vars for managed_node2 28983 1726883062.17390: Calling groups_inventory to load vars for managed_node2 28983 1726883062.17393: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.17398: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.17399: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.17402: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.18552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883062.20153: done with get_vars() 28983 1726883062.20174: done getting variables 28983 1726883062.20208: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:44:22 -0400 (0:00:00.085) 0:01:32.200 ****** 28983 1726883062.20232: entering _queue_task() for managed_node2/shell 28983 1726883062.20445: worker is 1 (out of 1 available) 28983 1726883062.20458: exiting _queue_task() for managed_node2/shell 28983 1726883062.20470: done queuing things up, now waiting for results queue to drain 28983 1726883062.20472: waiting for pending results... 
28983 1726883062.20778: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 28983 1726883062.20881: in run() - task 0affe814-3a2d-b16d-c0a7-0000000016ad 28983 1726883062.20904: variable 'ansible_search_path' from source: unknown 28983 1726883062.20913: variable 'ansible_search_path' from source: unknown 28983 1726883062.20961: calling self._execute() 28983 1726883062.21077: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.21092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.21108: variable 'omit' from source: magic vars 28983 1726883062.21546: variable 'ansible_distribution_major_version' from source: facts 28983 1726883062.21564: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883062.21580: variable 'omit' from source: magic vars 28983 1726883062.21666: variable 'omit' from source: magic vars 28983 1726883062.21813: variable 'interface' from source: play vars 28983 1726883062.21834: variable 'omit' from source: magic vars 28983 1726883062.21872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883062.21904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883062.21924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883062.21945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883062.21955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883062.21986: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883062.21989: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.21994: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.22080: Set connection var ansible_connection to ssh 28983 1726883062.22092: Set connection var ansible_shell_executable to /bin/sh 28983 1726883062.22100: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883062.22109: Set connection var ansible_timeout to 10 28983 1726883062.22115: Set connection var ansible_pipelining to False 28983 1726883062.22118: Set connection var ansible_shell_type to sh 28983 1726883062.22141: variable 'ansible_shell_executable' from source: unknown 28983 1726883062.22146: variable 'ansible_connection' from source: unknown 28983 1726883062.22149: variable 'ansible_module_compression' from source: unknown 28983 1726883062.22152: variable 'ansible_shell_type' from source: unknown 28983 1726883062.22155: variable 'ansible_shell_executable' from source: unknown 28983 1726883062.22159: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.22171: variable 'ansible_pipelining' from source: unknown 28983 1726883062.22175: variable 'ansible_timeout' from source: unknown 28983 1726883062.22177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.22303: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883062.22316: variable 'omit' from source: magic vars 28983 1726883062.22322: starting attempt loop 28983 1726883062.22325: running the handler 28983 1726883062.22338: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883062.22355: _low_level_execute_command(): starting 28983 1726883062.22364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883062.22903: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883062.22908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883062.22912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883062.22968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883062.22976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883062.23057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883062.25185: stdout chunk (state=3): >>>/root <<< 28983 1726883062.25189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883062.25193: stdout chunk (state=3): >>><<< 28983 
1726883062.25196: stderr chunk (state=3): >>><<< 28983 1726883062.25204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883062.25207: _low_level_execute_command(): starting 28983 1726883062.25209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033 `" && echo ansible-tmp-1726883062.250894-32440-170588180899033="` echo /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033 `" ) && sleep 0' 28983 1726883062.25803: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883062.25818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883062.25833: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 28983 1726883062.25957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883062.25976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883062.26091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883062.28120: stdout chunk (state=3): >>>ansible-tmp-1726883062.250894-32440-170588180899033=/root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033 <<< 28983 1726883062.28507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883062.28511: stdout chunk (state=3): >>><<< 28983 1726883062.28519: stderr chunk (state=3): >>><<< 28983 1726883062.28538: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883062.250894-32440-170588180899033=/root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883062.28570: variable 'ansible_module_compression' from source: unknown 28983 1726883062.28623: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883062.28665: variable 'ansible_facts' from source: unknown 28983 1726883062.28756: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py 28983 1726883062.28992: Sending initial data 28983 1726883062.28996: Sent initial data (155 bytes) 28983 1726883062.29558: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883062.29562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883062.29564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883062.29651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883062.29662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883062.29678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883062.29964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883062.31668: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883062.31737: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883062.31808: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpyiw9e47r /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py <<< 28983 1726883062.31813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py" <<< 28983 1726883062.31883: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpyiw9e47r" to remote "/root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py" <<< 28983 1726883062.34363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883062.34407: stderr chunk (state=3): >>><<< 28983 1726883062.34410: stdout chunk (state=3): >>><<< 28983 1726883062.34444: done transferring module to remote 28983 1726883062.34452: _low_level_execute_command(): starting 28983 1726883062.34458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/ /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py && sleep 0' 28983 1726883062.35515: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883062.35522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883062.35539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883062.35543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883062.35562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883062.35569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883062.35620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883062.35647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883062.35720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883062.37753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883062.37790: stderr chunk (state=3): >>><<< 28983 1726883062.37809: stdout chunk (state=3): >>><<< 28983 1726883062.37832: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883062.37863: _low_level_execute_command(): starting 28983 1726883062.37867: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/AnsiballZ_command.py && sleep 0' 28983 1726883062.38545: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883062.38561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883062.38579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883062.38613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883062.38654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883062.38720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883062.38768: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 28983 1726883062.38791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883062.38823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883062.39032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883062.63160: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (3ac79eb6-77ee-484f-9752-0ce3ea88e423) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:44:22.562231", "end": "2024-09-20 21:44:22.625120", "delta": "0:00:00.062889", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883062.64560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883062.64577: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883062.64674: stderr chunk (state=3): >>><<< 28983 1726883062.65025: stdout chunk (state=3): >>><<< 28983 1726883062.65030: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Connection 'statebr' (3ac79eb6-77ee-484f-9752-0ce3ea88e423) successfully deleted.", "stderr": "Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'", "rc": 0, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:44:22.562231", "end": "2024-09-20 21:44:22.625120", "delta": "0:00:00.062889", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883062.65033: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883062.65042: _low_level_execute_command(): starting 28983 1726883062.65045: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883062.250894-32440-170588180899033/ > /dev/null 2>&1 && sleep 0' 28983 1726883062.66250: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883062.66265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883062.66281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883062.66393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883062.66406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883062.66429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883062.66515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883062.68530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883062.68591: stderr chunk (state=3): >>><<< 28983 1726883062.68754: stdout chunk (state=3): >>><<< 28983 1726883062.68940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883062.68944: handler run complete 28983 1726883062.68946: Evaluated conditional (False): False 28983 1726883062.68949: attempt loop complete, returning result 28983 1726883062.68951: _execute() done 28983 1726883062.68953: dumping result to json 28983 1726883062.68955: done dumping result, returning 28983 1726883062.68958: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0affe814-3a2d-b16d-c0a7-0000000016ad] 28983 1726883062.68960: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000016ad ok: [managed_node2] => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.062889", "end": "2024-09-20 21:44:22.625120", "rc": 0, "start": "2024-09-20 21:44:22.562231" } STDOUT: Connection 'statebr' (3ac79eb6-77ee-484f-9752-0ce3ea88e423) successfully deleted. 
STDERR: Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' 28983 1726883062.69142: no more pending results, returning what we have 28983 1726883062.69148: results queue empty 28983 1726883062.69149: checking for any_errors_fatal 28983 1726883062.69151: done checking for any_errors_fatal 28983 1726883062.69152: checking for max_fail_percentage 28983 1726883062.69155: done checking for max_fail_percentage 28983 1726883062.69156: checking to see if all hosts have failed and the running result is not ok 28983 1726883062.69157: done checking to see if all hosts have failed 28983 1726883062.69158: getting the remaining hosts for this loop 28983 1726883062.69235: done getting the remaining hosts for this loop 28983 1726883062.69242: getting the next task for host managed_node2 28983 1726883062.69253: done getting next task for host managed_node2 28983 1726883062.69257: ^ task is: TASK: Include the task 'run_test.yml' 28983 1726883062.69260: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883062.69264: getting variables 28983 1726883062.69265: in VariableManager get_vars() 28983 1726883062.69316: Calling all_inventory to load vars for managed_node2 28983 1726883062.69319: Calling groups_inventory to load vars for managed_node2 28983 1726883062.69322: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.69387: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.69393: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.69398: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.69918: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000016ad 28983 1726883062.69923: WORKER PROCESS EXITING 28983 1726883062.79138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883062.82928: done with get_vars() 28983 1726883062.82978: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Friday 20 September 2024 21:44:22 -0400 (0:00:00.628) 0:01:32.828 ****** 28983 1726883062.83080: entering _queue_task() for managed_node2/include_tasks 28983 1726883062.83567: worker is 1 (out of 1 available) 28983 1726883062.83581: exiting _queue_task() for managed_node2/include_tasks 28983 1726883062.83592: done queuing things up, now waiting for results queue to drain 28983 1726883062.83593: waiting for pending results... 
28983 1726883062.83816: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 28983 1726883062.84058: in run() - task 0affe814-3a2d-b16d-c0a7-000000000015 28983 1726883062.84084: variable 'ansible_search_path' from source: unknown 28983 1726883062.84187: calling self._execute() 28983 1726883062.84313: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.84331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.84354: variable 'omit' from source: magic vars 28983 1726883062.84827: variable 'ansible_distribution_major_version' from source: facts 28983 1726883062.84850: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883062.84863: _execute() done 28983 1726883062.84877: dumping result to json 28983 1726883062.84888: done dumping result, returning 28983 1726883062.84899: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0affe814-3a2d-b16d-c0a7-000000000015] 28983 1726883062.84914: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000015 28983 1726883062.85116: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000015 28983 1726883062.85119: WORKER PROCESS EXITING 28983 1726883062.85157: no more pending results, returning what we have 28983 1726883062.85163: in VariableManager get_vars() 28983 1726883062.85220: Calling all_inventory to load vars for managed_node2 28983 1726883062.85224: Calling groups_inventory to load vars for managed_node2 28983 1726883062.85228: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.85245: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.85249: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.85254: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.87759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28983 1726883062.90963: done with get_vars() 28983 1726883062.91004: variable 'ansible_search_path' from source: unknown 28983 1726883062.91020: we have included files to process 28983 1726883062.91022: generating all_blocks data 28983 1726883062.91025: done generating all_blocks data 28983 1726883062.91045: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883062.91047: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883062.91051: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883062.91632: in VariableManager get_vars() 28983 1726883062.91659: done with get_vars() 28983 1726883062.91727: in VariableManager get_vars() 28983 1726883062.91753: done with get_vars() 28983 1726883062.91818: in VariableManager get_vars() 28983 1726883062.91844: done with get_vars() 28983 1726883062.91898: in VariableManager get_vars() 28983 1726883062.91932: done with get_vars() 28983 1726883062.91989: in VariableManager get_vars() 28983 1726883062.92019: done with get_vars() 28983 1726883062.92562: in VariableManager get_vars() 28983 1726883062.92588: done with get_vars() 28983 1726883062.92603: done processing included file 28983 1726883062.92605: iterating over new_blocks loaded from include file 28983 1726883062.92607: in VariableManager get_vars() 28983 1726883062.92623: done with get_vars() 28983 1726883062.92625: filtering new block on tags 28983 1726883062.92789: done filtering new block on tags 28983 1726883062.92792: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 28983 1726883062.92798: extending task lists for all hosts with included 
blocks 28983 1726883062.92844: done extending task lists 28983 1726883062.92846: done processing included files 28983 1726883062.92847: results queue empty 28983 1726883062.92848: checking for any_errors_fatal 28983 1726883062.92854: done checking for any_errors_fatal 28983 1726883062.92855: checking for max_fail_percentage 28983 1726883062.92856: done checking for max_fail_percentage 28983 1726883062.92857: checking to see if all hosts have failed and the running result is not ok 28983 1726883062.92858: done checking to see if all hosts have failed 28983 1726883062.92859: getting the remaining hosts for this loop 28983 1726883062.92861: done getting the remaining hosts for this loop 28983 1726883062.92864: getting the next task for host managed_node2 28983 1726883062.92868: done getting next task for host managed_node2 28983 1726883062.92871: ^ task is: TASK: TEST: {{ lsr_description }} 28983 1726883062.92873: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883062.92876: getting variables 28983 1726883062.92877: in VariableManager get_vars() 28983 1726883062.92900: Calling all_inventory to load vars for managed_node2 28983 1726883062.92903: Calling groups_inventory to load vars for managed_node2 28983 1726883062.92906: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883062.92913: Calling all_plugins_play to load vars for managed_node2 28983 1726883062.92916: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883062.92920: Calling groups_plugins_play to load vars for managed_node2 28983 1726883062.95388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883062.98398: done with get_vars() 28983 1726883062.98433: done getting variables 28983 1726883062.98488: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883062.98621: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:44:22 -0400 (0:00:00.155) 0:01:32.984 ****** 28983 1726883062.98656: entering _queue_task() for managed_node2/debug 28983 1726883062.99029: worker is 1 (out of 1 available) 28983 1726883062.99045: exiting _queue_task() for managed_node2/debug 28983 1726883062.99059: done queuing things up, now waiting for results queue to drain 28983 1726883062.99061: waiting for pending results... 
28983 1726883062.99652: running TaskExecutor() for managed_node2/TASK: TEST: I can take a profile down that is absent 28983 1726883062.99657: in run() - task 0affe814-3a2d-b16d-c0a7-000000001744 28983 1726883062.99660: variable 'ansible_search_path' from source: unknown 28983 1726883062.99663: variable 'ansible_search_path' from source: unknown 28983 1726883062.99666: calling self._execute() 28983 1726883062.99710: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883062.99723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883062.99742: variable 'omit' from source: magic vars 28983 1726883063.00206: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.00231: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.00246: variable 'omit' from source: magic vars 28983 1726883063.00301: variable 'omit' from source: magic vars 28983 1726883063.00440: variable 'lsr_description' from source: include params 28983 1726883063.00467: variable 'omit' from source: magic vars 28983 1726883063.00522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883063.00575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.00603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883063.00625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.00643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.00740: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.00744: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883063.00746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.00827: Set connection var ansible_connection to ssh 28983 1726883063.00861: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.00939: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.00943: Set connection var ansible_timeout to 10 28983 1726883063.00945: Set connection var ansible_pipelining to False 28983 1726883063.00948: Set connection var ansible_shell_type to sh 28983 1726883063.00950: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.00953: variable 'ansible_connection' from source: unknown 28983 1726883063.00955: variable 'ansible_module_compression' from source: unknown 28983 1726883063.00957: variable 'ansible_shell_type' from source: unknown 28983 1726883063.00959: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.00976: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.00987: variable 'ansible_pipelining' from source: unknown 28983 1726883063.00995: variable 'ansible_timeout' from source: unknown 28983 1726883063.01005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.01194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.01215: variable 'omit' from source: magic vars 28983 1726883063.01226: starting attempt loop 28983 1726883063.01238: running the handler 28983 1726883063.01408: handler run complete 28983 1726883063.01412: attempt loop complete, returning result 28983 1726883063.01414: _execute() done 28983 1726883063.01417: dumping result to json 28983 1726883063.01419: done dumping result, returning 
28983 1726883063.01421: done running TaskExecutor() for managed_node2/TASK: TEST: I can take a profile down that is absent [0affe814-3a2d-b16d-c0a7-000000001744] 28983 1726883063.01424: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001744 28983 1726883063.01509: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001744 28983 1726883063.01745: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can take a profile down that is absent ########## 28983 1726883063.01796: no more pending results, returning what we have 28983 1726883063.01800: results queue empty 28983 1726883063.01801: checking for any_errors_fatal 28983 1726883063.01804: done checking for any_errors_fatal 28983 1726883063.01805: checking for max_fail_percentage 28983 1726883063.01807: done checking for max_fail_percentage 28983 1726883063.01808: checking to see if all hosts have failed and the running result is not ok 28983 1726883063.01809: done checking to see if all hosts have failed 28983 1726883063.01810: getting the remaining hosts for this loop 28983 1726883063.01812: done getting the remaining hosts for this loop 28983 1726883063.01816: getting the next task for host managed_node2 28983 1726883063.01823: done getting next task for host managed_node2 28983 1726883063.01826: ^ task is: TASK: Show item 28983 1726883063.01829: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883063.01833: getting variables 28983 1726883063.01837: in VariableManager get_vars() 28983 1726883063.01877: Calling all_inventory to load vars for managed_node2 28983 1726883063.01881: Calling groups_inventory to load vars for managed_node2 28983 1726883063.01885: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.01896: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.01900: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883063.01904: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.04316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883063.07252: done with get_vars() 28983 1726883063.07290: done getting variables 28983 1726883063.07361: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:44:23 -0400 (0:00:00.087) 0:01:33.071 ****** 28983 1726883063.07400: entering _queue_task() for managed_node2/debug 28983 1726883063.07950: worker is 1 (out of 1 available) 28983 1726883063.07961: exiting _queue_task() for managed_node2/debug 28983 1726883063.07976: done queuing things up, now waiting for results queue to drain 28983 1726883063.07978: waiting for pending results... 
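The banner task that produced the `ok: [managed_node2]` result above runs from `run_test.yml:5` and templates `lsr_description` from include params ("variable 'lsr_description' from source: include params"). A hypothetical sketch of what such a task looks like — a reconstruction from the log, not the verbatim contents of `run_test.yml`:

```yaml
# Hypothetical reconstruction of the banner task at run_test.yml:5.
# lsr_description is passed in as an include parameter; the rendered
# task name becomes "TEST: I can take a profile down that is absent".
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: >-
      ##########
      {{ lsr_description }}
      ##########
```

Because the task name itself contains `{{ lsr_description }}`, the strategy logs the unrendered form ("task is: TASK: TEST: {{ lsr_description }}") when planning, and the rendered form once host vars are resolved.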
28983 1726883063.08102: running TaskExecutor() for managed_node2/TASK: Show item 28983 1726883063.08241: in run() - task 0affe814-3a2d-b16d-c0a7-000000001745 28983 1726883063.08264: variable 'ansible_search_path' from source: unknown 28983 1726883063.08276: variable 'ansible_search_path' from source: unknown 28983 1726883063.08346: variable 'omit' from source: magic vars 28983 1726883063.08531: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.08555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.08578: variable 'omit' from source: magic vars 28983 1726883063.09025: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.09048: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.09060: variable 'omit' from source: magic vars 28983 1726883063.09116: variable 'omit' from source: magic vars 28983 1726883063.09198: variable 'item' from source: unknown 28983 1726883063.09268: variable 'item' from source: unknown 28983 1726883063.09306: variable 'omit' from source: magic vars 28983 1726883063.09344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883063.09415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.09418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883063.09445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.09463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.09508: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.09638: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726883063.09642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.09658: Set connection var ansible_connection to ssh 28983 1726883063.09677: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.09692: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.09705: Set connection var ansible_timeout to 10 28983 1726883063.09714: Set connection var ansible_pipelining to False 28983 1726883063.09720: Set connection var ansible_shell_type to sh 28983 1726883063.09747: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.09760: variable 'ansible_connection' from source: unknown 28983 1726883063.09768: variable 'ansible_module_compression' from source: unknown 28983 1726883063.09779: variable 'ansible_shell_type' from source: unknown 28983 1726883063.09787: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.09794: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.09839: variable 'ansible_pipelining' from source: unknown 28983 1726883063.09843: variable 'ansible_timeout' from source: unknown 28983 1726883063.09846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.09994: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.10012: variable 'omit' from source: magic vars 28983 1726883063.10023: starting attempt loop 28983 1726883063.10030: running the handler 28983 1726883063.10096: variable 'lsr_description' from source: include params 28983 1726883063.10192: variable 'lsr_description' from source: include params 28983 1726883063.10201: handler run complete 28983 1726883063.10228: attempt loop 
complete, returning result 28983 1726883063.10301: variable 'item' from source: unknown 28983 1726883063.10339: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 28983 1726883063.10841: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.10845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.10847: variable 'omit' from source: magic vars 28983 1726883063.10849: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.10852: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.10862: variable 'omit' from source: magic vars 28983 1726883063.10887: variable 'omit' from source: magic vars 28983 1726883063.10942: variable 'item' from source: unknown 28983 1726883063.11025: variable 'item' from source: unknown 28983 1726883063.11052: variable 'omit' from source: magic vars 28983 1726883063.11086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.11100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.11111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.11170: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.11176: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.11180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.11248: Set connection var ansible_connection to ssh 28983 1726883063.11265: Set connection var ansible_shell_executable to 
/bin/sh 28983 1726883063.11289: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.11305: Set connection var ansible_timeout to 10 28983 1726883063.11316: Set connection var ansible_pipelining to False 28983 1726883063.11323: Set connection var ansible_shell_type to sh 28983 1726883063.11389: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.11393: variable 'ansible_connection' from source: unknown 28983 1726883063.11395: variable 'ansible_module_compression' from source: unknown 28983 1726883063.11397: variable 'ansible_shell_type' from source: unknown 28983 1726883063.11399: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.11402: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.11404: variable 'ansible_pipelining' from source: unknown 28983 1726883063.11406: variable 'ansible_timeout' from source: unknown 28983 1726883063.11408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.11521: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.11542: variable 'omit' from source: magic vars 28983 1726883063.11606: starting attempt loop 28983 1726883063.11610: running the handler 28983 1726883063.11613: variable 'lsr_setup' from source: include params 28983 1726883063.11685: variable 'lsr_setup' from source: include params 28983 1726883063.11749: handler run complete 28983 1726883063.11777: attempt loop complete, returning result 28983 1726883063.11801: variable 'item' from source: unknown 28983 1726883063.11889: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 28983 1726883063.12141: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.12145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.12148: variable 'omit' from source: magic vars 28983 1726883063.12320: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.12331: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.12343: variable 'omit' from source: magic vars 28983 1726883063.12367: variable 'omit' from source: magic vars 28983 1726883063.12423: variable 'item' from source: unknown 28983 1726883063.12509: variable 'item' from source: unknown 28983 1726883063.12531: variable 'omit' from source: magic vars 28983 1726883063.12556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.12569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.12739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.12742: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.12745: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.12747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.12749: Set connection var ansible_connection to ssh 28983 1726883063.12752: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.12754: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.12756: Set connection var ansible_timeout to 10 28983 1726883063.12758: Set connection var ansible_pipelining to 
False 28983 1726883063.12760: Set connection var ansible_shell_type to sh 28983 1726883063.12788: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.12796: variable 'ansible_connection' from source: unknown 28983 1726883063.12804: variable 'ansible_module_compression' from source: unknown 28983 1726883063.12811: variable 'ansible_shell_type' from source: unknown 28983 1726883063.12819: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.12826: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.12836: variable 'ansible_pipelining' from source: unknown 28983 1726883063.12844: variable 'ansible_timeout' from source: unknown 28983 1726883063.12852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.12965: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.12985: variable 'omit' from source: magic vars 28983 1726883063.12994: starting attempt loop 28983 1726883063.13001: running the handler 28983 1726883063.13026: variable 'lsr_test' from source: include params 28983 1726883063.13150: variable 'lsr_test' from source: include params 28983 1726883063.13153: handler run complete 28983 1726883063.13158: attempt loop complete, returning result 28983 1726883063.13182: variable 'item' from source: unknown 28983 1726883063.13265: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 28983 1726883063.13475: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.13479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 
28983 1726883063.13482: variable 'omit' from source: magic vars 28983 1726883063.13683: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.13699: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.13708: variable 'omit' from source: magic vars 28983 1726883063.13728: variable 'omit' from source: magic vars 28983 1726883063.13787: variable 'item' from source: unknown 28983 1726883063.13871: variable 'item' from source: unknown 28983 1726883063.13908: variable 'omit' from source: magic vars 28983 1726883063.14016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.14020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.14022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.14024: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.14027: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.14029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.14080: Set connection var ansible_connection to ssh 28983 1726883063.14098: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.14113: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.14132: Set connection var ansible_timeout to 10 28983 1726883063.14146: Set connection var ansible_pipelining to False 28983 1726883063.14155: Set connection var ansible_shell_type to sh 28983 1726883063.14183: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.14191: variable 'ansible_connection' from source: unknown 28983 1726883063.14199: variable 'ansible_module_compression' 
from source: unknown 28983 1726883063.14205: variable 'ansible_shell_type' from source: unknown 28983 1726883063.14212: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.14220: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.14232: variable 'ansible_pipelining' from source: unknown 28983 1726883063.14242: variable 'ansible_timeout' from source: unknown 28983 1726883063.14251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.14361: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.14378: variable 'omit' from source: magic vars 28983 1726883063.14451: starting attempt loop 28983 1726883063.14454: running the handler 28983 1726883063.14457: variable 'lsr_assert' from source: include params 28983 1726883063.14502: variable 'lsr_assert' from source: include params 28983 1726883063.14524: handler run complete 28983 1726883063.14550: attempt loop complete, returning result 28983 1726883063.14579: variable 'item' from source: unknown 28983 1726883063.14657: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 28983 1726883063.15040: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.15044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.15047: variable 'omit' from source: magic vars 28983 1726883063.15131: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.15147: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.15166: variable 
'omit' from source: magic vars 28983 1726883063.15190: variable 'omit' from source: magic vars 28983 1726883063.15245: variable 'item' from source: unknown 28983 1726883063.15329: variable 'item' from source: unknown 28983 1726883063.15354: variable 'omit' from source: magic vars 28983 1726883063.15384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.15396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.15406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.15422: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.15430: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.15440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.15522: Set connection var ansible_connection to ssh 28983 1726883063.15540: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.15553: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.15565: Set connection var ansible_timeout to 10 28983 1726883063.15580: Set connection var ansible_pipelining to False 28983 1726883063.15588: Set connection var ansible_shell_type to sh 28983 1726883063.15621: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.15630: variable 'ansible_connection' from source: unknown 28983 1726883063.15642: variable 'ansible_module_compression' from source: unknown 28983 1726883063.15650: variable 'ansible_shell_type' from source: unknown 28983 1726883063.15657: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.15664: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883063.15675: variable 'ansible_pipelining' from source: unknown 28983 1726883063.15684: variable 'ansible_timeout' from source: unknown 28983 1726883063.15693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.15818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.15833: variable 'omit' from source: magic vars 28983 1726883063.15926: starting attempt loop 28983 1726883063.15930: running the handler 28983 1726883063.15932: variable 'lsr_assert_when' from source: include params 28983 1726883063.15966: variable 'lsr_assert_when' from source: include params 28983 1726883063.16088: variable 'network_provider' from source: set_fact 28983 1726883063.16133: handler run complete 28983 1726883063.16166: attempt loop complete, returning result 28983 1726883063.16193: variable 'item' from source: unknown 28983 1726883063.16278: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 28983 1726883063.16562: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.16566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.16568: variable 'omit' from source: magic vars 28983 1726883063.16692: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.16704: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.16714: variable 'omit' from source: magic vars 28983 1726883063.16737: variable 'omit' from source: magic vars 28983 1726883063.16799: variable 'item' from 
source: unknown 28983 1726883063.16902: variable 'item' from source: unknown 28983 1726883063.16905: variable 'omit' from source: magic vars 28983 1726883063.16928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.16944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.16956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.16976: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.17011: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.17014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.17093: Set connection var ansible_connection to ssh 28983 1726883063.17111: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.17240: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.17243: Set connection var ansible_timeout to 10 28983 1726883063.17246: Set connection var ansible_pipelining to False 28983 1726883063.17248: Set connection var ansible_shell_type to sh 28983 1726883063.17250: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.17252: variable 'ansible_connection' from source: unknown 28983 1726883063.17255: variable 'ansible_module_compression' from source: unknown 28983 1726883063.17257: variable 'ansible_shell_type' from source: unknown 28983 1726883063.17260: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.17262: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.17264: variable 'ansible_pipelining' from source: unknown 28983 1726883063.17266: variable 'ansible_timeout' from source: unknown 28983 
1726883063.17268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.17383: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.17387: variable 'omit' from source: magic vars 28983 1726883063.17389: starting attempt loop 28983 1726883063.17392: running the handler 28983 1726883063.17441: variable 'lsr_fail_debug' from source: play vars 28983 1726883063.17495: variable 'lsr_fail_debug' from source: play vars 28983 1726883063.17520: handler run complete 28983 1726883063.17544: attempt loop complete, returning result 28983 1726883063.17566: variable 'item' from source: unknown 28983 1726883063.17708: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 28983 1726883063.17926: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.17929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.17932: variable 'omit' from source: magic vars 28983 1726883063.18062: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.18078: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.18089: variable 'omit' from source: magic vars 28983 1726883063.18112: variable 'omit' from source: magic vars 28983 1726883063.18175: variable 'item' from source: unknown 28983 1726883063.18256: variable 'item' from source: unknown 28983 1726883063.18284: variable 'omit' from source: magic vars 28983 1726883063.18310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 28983 1726883063.18327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.18343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.18367: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.18379: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.18388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.18578: Set connection var ansible_connection to ssh 28983 1726883063.18582: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.18584: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.18586: Set connection var ansible_timeout to 10 28983 1726883063.18589: Set connection var ansible_pipelining to False 28983 1726883063.18591: Set connection var ansible_shell_type to sh 28983 1726883063.18593: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.18595: variable 'ansible_connection' from source: unknown 28983 1726883063.18597: variable 'ansible_module_compression' from source: unknown 28983 1726883063.18599: variable 'ansible_shell_type' from source: unknown 28983 1726883063.18601: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.18603: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.18605: variable 'ansible_pipelining' from source: unknown 28983 1726883063.18613: variable 'ansible_timeout' from source: unknown 28983 1726883063.18623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.18740: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.18755: variable 'omit' from source: magic vars 28983 1726883063.18765: starting attempt loop 28983 1726883063.18776: running the handler 28983 1726883063.18807: variable 'lsr_cleanup' from source: include params 28983 1726883063.18890: variable 'lsr_cleanup' from source: include params 28983 1726883063.18916: handler run complete 28983 1726883063.18938: attempt loop complete, returning result 28983 1726883063.18959: variable 'item' from source: unknown 28983 1726883063.19041: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 28983 1726883063.19183: dumping result to json 28983 1726883063.19186: done dumping result, returning 28983 1726883063.19188: done running TaskExecutor() for managed_node2/TASK: Show item [0affe814-3a2d-b16d-c0a7-000000001745] 28983 1726883063.19190: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001745 28983 1726883063.19390: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001745 28983 1726883063.19393: WORKER PROCESS EXITING 28983 1726883063.19521: no more pending results, returning what we have 28983 1726883063.19525: results queue empty 28983 1726883063.19526: checking for any_errors_fatal 28983 1726883063.19539: done checking for any_errors_fatal 28983 1726883063.19541: checking for max_fail_percentage 28983 1726883063.19543: done checking for max_fail_percentage 28983 1726883063.19544: checking to see if all hosts have failed and the running result is not ok 28983 1726883063.19545: done checking to see if all hosts have failed 28983 1726883063.19546: getting the remaining hosts for this loop 28983 1726883063.19548: done getting the remaining hosts for this loop 28983 
1726883063.19553: getting the next task for host managed_node2 28983 1726883063.19562: done getting next task for host managed_node2 28983 1726883063.19565: ^ task is: TASK: Include the task 'show_interfaces.yml' 28983 1726883063.19568: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883063.19576: getting variables 28983 1726883063.19578: in VariableManager get_vars() 28983 1726883063.19623: Calling all_inventory to load vars for managed_node2 28983 1726883063.19627: Calling groups_inventory to load vars for managed_node2 28983 1726883063.19631: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.19846: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.19851: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883063.19855: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.22138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883063.25153: done with get_vars() 28983 1726883063.25192: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:44:23 -0400 (0:00:00.179) 0:01:33.250 ****** 28983 1726883063.25308: entering _queue_task() for managed_node2/include_tasks 28983 
1726883063.25711: worker is 1 (out of 1 available) 28983 1726883063.25723: exiting _queue_task() for managed_node2/include_tasks 28983 1726883063.25841: done queuing things up, now waiting for results queue to drain 28983 1726883063.25844: waiting for pending results... 28983 1726883063.26060: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28983 1726883063.26242: in run() - task 0affe814-3a2d-b16d-c0a7-000000001746 28983 1726883063.26249: variable 'ansible_search_path' from source: unknown 28983 1726883063.26255: variable 'ansible_search_path' from source: unknown 28983 1726883063.26280: calling self._execute() 28983 1726883063.26402: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.26416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.26476: variable 'omit' from source: magic vars 28983 1726883063.26923: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.26944: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.26955: _execute() done 28983 1726883063.26965: dumping result to json 28983 1726883063.26976: done dumping result, returning 28983 1726883063.27018: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-b16d-c0a7-000000001746] 28983 1726883063.27021: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001746 28983 1726883063.27276: no more pending results, returning what we have 28983 1726883063.27284: in VariableManager get_vars() 28983 1726883063.27346: Calling all_inventory to load vars for managed_node2 28983 1726883063.27350: Calling groups_inventory to load vars for managed_node2 28983 1726883063.27355: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.27371: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.27379: Calling groups_plugins_inventory to load 
vars for managed_node2 28983 1726883063.27383: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.27951: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001746 28983 1726883063.27955: WORKER PROCESS EXITING 28983 1726883063.29681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883063.32725: done with get_vars() 28983 1726883063.32770: variable 'ansible_search_path' from source: unknown 28983 1726883063.32774: variable 'ansible_search_path' from source: unknown 28983 1726883063.32827: we have included files to process 28983 1726883063.32828: generating all_blocks data 28983 1726883063.32831: done generating all_blocks data 28983 1726883063.32841: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883063.32842: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883063.32846: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883063.32986: in VariableManager get_vars() 28983 1726883063.33016: done with get_vars() 28983 1726883063.33160: done processing included file 28983 1726883063.33162: iterating over new_blocks loaded from include file 28983 1726883063.33164: in VariableManager get_vars() 28983 1726883063.33187: done with get_vars() 28983 1726883063.33189: filtering new block on tags 28983 1726883063.33237: done filtering new block on tags 28983 1726883063.33240: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28983 1726883063.33246: extending task lists for all hosts with included blocks 28983 1726883063.33894: 
done extending task lists 28983 1726883063.33895: done processing included files 28983 1726883063.33896: results queue empty 28983 1726883063.33897: checking for any_errors_fatal 28983 1726883063.33907: done checking for any_errors_fatal 28983 1726883063.33908: checking for max_fail_percentage 28983 1726883063.33910: done checking for max_fail_percentage 28983 1726883063.33911: checking to see if all hosts have failed and the running result is not ok 28983 1726883063.33912: done checking to see if all hosts have failed 28983 1726883063.33913: getting the remaining hosts for this loop 28983 1726883063.33914: done getting the remaining hosts for this loop 28983 1726883063.33918: getting the next task for host managed_node2 28983 1726883063.33923: done getting next task for host managed_node2 28983 1726883063.33925: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28983 1726883063.33929: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883063.33932: getting variables 28983 1726883063.33935: in VariableManager get_vars() 28983 1726883063.33948: Calling all_inventory to load vars for managed_node2 28983 1726883063.33951: Calling groups_inventory to load vars for managed_node2 28983 1726883063.33954: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.33961: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.33964: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883063.33968: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.36052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883063.38957: done with get_vars() 28983 1726883063.38994: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:44:23 -0400 (0:00:00.137) 0:01:33.388 ****** 28983 1726883063.39088: entering _queue_task() for managed_node2/include_tasks 28983 1726883063.39480: worker is 1 (out of 1 available) 28983 1726883063.39493: exiting _queue_task() for managed_node2/include_tasks 28983 1726883063.39506: done queuing things up, now waiting for results queue to drain 28983 1726883063.39508: waiting for pending results... 
28983 1726883063.39837: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28983 1726883063.39992: in run() - task 0affe814-3a2d-b16d-c0a7-00000000176d 28983 1726883063.40013: variable 'ansible_search_path' from source: unknown 28983 1726883063.40022: variable 'ansible_search_path' from source: unknown 28983 1726883063.40076: calling self._execute() 28983 1726883063.40197: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.40217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.40238: variable 'omit' from source: magic vars 28983 1726883063.40719: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.40744: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.40756: _execute() done 28983 1726883063.40766: dumping result to json 28983 1726883063.40779: done dumping result, returning 28983 1726883063.40794: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-b16d-c0a7-00000000176d] 28983 1726883063.40806: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000176d 28983 1726883063.41069: no more pending results, returning what we have 28983 1726883063.41077: in VariableManager get_vars() 28983 1726883063.41130: Calling all_inventory to load vars for managed_node2 28983 1726883063.41136: Calling groups_inventory to load vars for managed_node2 28983 1726883063.41141: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.41156: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.41161: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883063.41166: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.41751: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000176d 28983 1726883063.41755: WORKER PROCESS EXITING 28983 
1726883063.43681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883063.46652: done with get_vars() 28983 1726883063.46686: variable 'ansible_search_path' from source: unknown 28983 1726883063.46688: variable 'ansible_search_path' from source: unknown 28983 1726883063.46732: we have included files to process 28983 1726883063.46733: generating all_blocks data 28983 1726883063.46738: done generating all_blocks data 28983 1726883063.46740: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883063.46741: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883063.46744: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883063.47074: done processing included file 28983 1726883063.47076: iterating over new_blocks loaded from include file 28983 1726883063.47078: in VariableManager get_vars() 28983 1726883063.47099: done with get_vars() 28983 1726883063.47101: filtering new block on tags 28983 1726883063.47152: done filtering new block on tags 28983 1726883063.47155: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28983 1726883063.47161: extending task lists for all hosts with included blocks 28983 1726883063.47387: done extending task lists 28983 1726883063.47389: done processing included files 28983 1726883063.47390: results queue empty 28983 1726883063.47391: checking for any_errors_fatal 28983 1726883063.47395: done checking for any_errors_fatal 28983 1726883063.47396: checking for max_fail_percentage 28983 1726883063.47397: done 
checking for max_fail_percentage 28983 1726883063.47399: checking to see if all hosts have failed and the running result is not ok 28983 1726883063.47400: done checking to see if all hosts have failed 28983 1726883063.47401: getting the remaining hosts for this loop 28983 1726883063.47402: done getting the remaining hosts for this loop 28983 1726883063.47405: getting the next task for host managed_node2 28983 1726883063.47411: done getting next task for host managed_node2 28983 1726883063.47413: ^ task is: TASK: Gather current interface info 28983 1726883063.47418: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883063.47421: getting variables 28983 1726883063.47422: in VariableManager get_vars() 28983 1726883063.47436: Calling all_inventory to load vars for managed_node2 28983 1726883063.47439: Calling groups_inventory to load vars for managed_node2 28983 1726883063.47442: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.47448: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.47451: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883063.47455: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.49513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883063.52645: done with get_vars() 28983 1726883063.52689: done getting variables 28983 1726883063.52747: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:44:23 -0400 (0:00:00.136) 0:01:33.525 ****** 28983 1726883063.52791: entering _queue_task() for managed_node2/command 28983 1726883063.53204: worker is 1 (out of 1 available) 28983 1726883063.53219: exiting _queue_task() for managed_node2/command 28983 1726883063.53238: done queuing things up, now waiting for results queue to drain 28983 1726883063.53241: waiting for pending results... 
28983 1726883063.53564: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28983 1726883063.53748: in run() - task 0affe814-3a2d-b16d-c0a7-0000000017a8 28983 1726883063.53779: variable 'ansible_search_path' from source: unknown 28983 1726883063.53789: variable 'ansible_search_path' from source: unknown 28983 1726883063.53838: calling self._execute() 28983 1726883063.53960: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.53984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.54005: variable 'omit' from source: magic vars 28983 1726883063.54542: variable 'ansible_distribution_major_version' from source: facts 28983 1726883063.54545: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883063.54548: variable 'omit' from source: magic vars 28983 1726883063.54761: variable 'omit' from source: magic vars 28983 1726883063.54764: variable 'omit' from source: magic vars 28983 1726883063.54806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883063.55041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883063.55045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883063.55055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.55077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883063.55124: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883063.55207: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.55218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883063.55465: Set connection var ansible_connection to ssh 28983 1726883063.55544: Set connection var ansible_shell_executable to /bin/sh 28983 1726883063.55564: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883063.55585: Set connection var ansible_timeout to 10 28983 1726883063.55598: Set connection var ansible_pipelining to False 28983 1726883063.55660: Set connection var ansible_shell_type to sh 28983 1726883063.55691: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.55942: variable 'ansible_connection' from source: unknown 28983 1726883063.55946: variable 'ansible_module_compression' from source: unknown 28983 1726883063.55949: variable 'ansible_shell_type' from source: unknown 28983 1726883063.55952: variable 'ansible_shell_executable' from source: unknown 28983 1726883063.55954: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883063.55956: variable 'ansible_pipelining' from source: unknown 28983 1726883063.55959: variable 'ansible_timeout' from source: unknown 28983 1726883063.55961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883063.56199: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883063.56217: variable 'omit' from source: magic vars 28983 1726883063.56227: starting attempt loop 28983 1726883063.56236: running the handler 28983 1726883063.56260: _low_level_execute_command(): starting 28983 1726883063.56379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883063.57926: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.58027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883063.58140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883063.58279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883063.60058: stdout chunk (state=3): >>>/root <<< 28983 1726883063.60228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883063.60236: stdout chunk (state=3): >>><<< 28983 1726883063.60247: stderr chunk (state=3): >>><<< 28983 1726883063.60297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883063.60312: _low_level_execute_command(): starting 28983 1726883063.60318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032 `" && echo ansible-tmp-1726883063.6029665-32480-12513611009032="` echo /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032 `" ) && sleep 0' 28983 1726883063.61049: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.61129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883063.61145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883063.61180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883063.61260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883063.63367: stdout chunk (state=3): >>>ansible-tmp-1726883063.6029665-32480-12513611009032=/root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032 <<< 28983 1726883063.63575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883063.63578: stdout chunk (state=3): >>><<< 28983 1726883063.63581: stderr chunk (state=3): >>><<< 28983 1726883063.63607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883063.6029665-32480-12513611009032=/root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883063.63661: variable 'ansible_module_compression' from source: unknown 28983 1726883063.63782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883063.63787: variable 'ansible_facts' from source: unknown 28983 1726883063.63892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py 28983 1726883063.64186: Sending initial data 28983 1726883063.64189: Sent initial data (155 bytes) 28983 1726883063.64794: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.64891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883063.64929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883063.65011: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28983 1726883063.66715: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726883063.66719: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883063.66781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883063.66877: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmppq954ytu /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py <<< 28983 1726883063.66881: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py" <<< 28983 1726883063.66938: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmppq954ytu" to remote "/root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py" <<< 28983 1726883063.68239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883063.68242: stderr chunk (state=3): >>><<< 28983 1726883063.68245: stdout chunk (state=3): >>><<< 28983 1726883063.68247: done transferring module to remote 28983 1726883063.68258: _low_level_execute_command(): starting 28983 1726883063.68264: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/ /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py && sleep 0' 28983 1726883063.68821: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883063.68831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883063.68939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883063.68943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883063.68946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 
1726883063.68949: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883063.68951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.68954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883063.68956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883063.68959: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883063.68961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883063.68964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883063.68966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883063.68968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883063.68970: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883063.68975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.69060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883063.69063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883063.69084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883063.69184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883063.71259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883063.71284: stderr chunk (state=3): >>><<< 28983 1726883063.71294: stdout chunk (state=3): >>><<< 28983 1726883063.71327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883063.71339: _low_level_execute_command(): starting 28983 1726883063.71353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/AnsiballZ_command.py && sleep 0' 28983 1726883063.71988: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883063.72005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883063.72021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883063.72054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883063.72068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.72149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883063.72230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883063.90087: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:44:23.895600", "end": "2024-09-20 21:44:23.899492", "delta": "0:00:00.003892", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883063.91794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883063.91872: stderr chunk (state=3): >>><<< 28983 1726883063.91876: stdout chunk (state=3): >>><<< 28983 1726883063.92045: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:44:23.895600", "end": "2024-09-20 21:44:23.899492", "delta": "0:00:00.003892", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883063.92049: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883063.92052: _low_level_execute_command(): starting 28983 1726883063.92055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883063.6029665-32480-12513611009032/ > /dev/null 2>&1 && sleep 0' 28983 1726883063.92617: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883063.92622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883063.92636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883063.92654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883063.92668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883063.92680: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883063.92695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.92717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883063.92823: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883063.92831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883063.92836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883063.92909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883063.92985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883063.95064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883063.95068: stdout chunk (state=3): >>><<< 28983 1726883063.95071: stderr chunk (state=3): >>><<< 28983 1726883063.95139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883063.95143: handler run complete 28983 1726883063.95151: Evaluated conditional (False): False 28983 1726883063.95178: attempt loop complete, returning result 28983 1726883063.95191: _execute() done 28983 1726883063.95257: dumping result to json 28983 1726883063.95260: done dumping result, returning 28983 1726883063.95263: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affe814-3a2d-b16d-c0a7-0000000017a8] 28983 1726883063.95266: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017a8 ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003892", "end": "2024-09-20 21:44:23.899492", "rc": 0, "start": "2024-09-20 21:44:23.895600" } STDOUT: bonding_masters eth0 lo 28983 1726883063.95626: no more pending results, returning what we have 28983 1726883063.95631: results queue empty 28983 1726883063.95632: checking for any_errors_fatal 28983 1726883063.95664: done checking for any_errors_fatal 28983 1726883063.95666: checking for max_fail_percentage 28983 1726883063.95668: done checking for max_fail_percentage 28983 1726883063.95670: checking to see if all hosts have failed and the running result is not ok 28983 1726883063.95671: done checking to see if all hosts have failed 28983 1726883063.95672: getting the remaining hosts for this loop 28983 1726883063.95674: done getting the remaining hosts for this loop 28983 1726883063.95679: getting the next task for host managed_node2 28983 1726883063.95691: done getting next task for host managed_node2 28983 1726883063.95694: ^ task is: TASK: Set current_interfaces 28983 1726883063.95702: ^ state is: HOST STATE: block=7, task=2, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883063.95757: getting variables 28983 1726883063.95759: in VariableManager get_vars() 28983 1726883063.95902: Calling all_inventory to load vars for managed_node2 28983 1726883063.95906: Calling groups_inventory to load vars for managed_node2 28983 1726883063.95910: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883063.95927: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017a8 28983 1726883063.95930: WORKER PROCESS EXITING 28983 1726883063.95948: Calling all_plugins_play to load vars for managed_node2 28983 1726883063.95952: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883063.95957: Calling groups_plugins_play to load vars for managed_node2 28983 1726883063.98952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.01450: done with get_vars() 28983 1726883064.01477: done getting variables 28983 1726883064.01527: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:44:24 -0400 (0:00:00.487) 0:01:34.013 ****** 28983 1726883064.01557: entering _queue_task() for managed_node2/set_fact 28983 1726883064.01814: worker is 1 (out of 1 available) 28983 1726883064.01830: exiting _queue_task() for managed_node2/set_fact 28983 1726883064.01846: done queuing things up, now waiting for results queue to drain 28983 1726883064.01849: waiting for pending results... 
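The "Gather current interface info" result above records a `command` module run with `module_args` of `chdir: /sys/class/net` and `_raw_params: ls -1`, registered and then consumed by the "Set current_interfaces" `set_fact` task being queued here. The tasks in `get_current_interfaces.yml` were probably shaped roughly like the following sketch; the task names, the `_current_interfaces` register variable, and the module arguments are taken from the log, but the exact YAML is a reconstruction, not the verbatim test file (the `changed_when: false` is inferred from the `Evaluated conditional (False): False` line and the `"changed": false` in the displayed result):

```yaml
# Hypothetical reconstruction of tasks/get_current_interfaces.yml,
# inferred from the module_args and results recorded in this log.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces
  changed_when: false

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

With the stdout recorded above (`bonding_masters`, `eth0`, `lo`), `stdout_lines` would yield exactly the three-element list that later appears in the `current_interfaces` fact.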
28983 1726883064.02041: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28983 1726883064.02140: in run() - task 0affe814-3a2d-b16d-c0a7-0000000017a9 28983 1726883064.02154: variable 'ansible_search_path' from source: unknown 28983 1726883064.02158: variable 'ansible_search_path' from source: unknown 28983 1726883064.02196: calling self._execute() 28983 1726883064.02280: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.02296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.02301: variable 'omit' from source: magic vars 28983 1726883064.02834: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.02839: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.02844: variable 'omit' from source: magic vars 28983 1726883064.02887: variable 'omit' from source: magic vars 28983 1726883064.03031: variable '_current_interfaces' from source: set_fact 28983 1726883064.03121: variable 'omit' from source: magic vars 28983 1726883064.03191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883064.03242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883064.03276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883064.03432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883064.03437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883064.03440: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883064.03443: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.03445: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.03530: Set connection var ansible_connection to ssh 28983 1726883064.03552: Set connection var ansible_shell_executable to /bin/sh 28983 1726883064.03576: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883064.03587: Set connection var ansible_timeout to 10 28983 1726883064.03594: Set connection var ansible_pipelining to False 28983 1726883064.03597: Set connection var ansible_shell_type to sh 28983 1726883064.03615: variable 'ansible_shell_executable' from source: unknown 28983 1726883064.03619: variable 'ansible_connection' from source: unknown 28983 1726883064.03623: variable 'ansible_module_compression' from source: unknown 28983 1726883064.03626: variable 'ansible_shell_type' from source: unknown 28983 1726883064.03630: variable 'ansible_shell_executable' from source: unknown 28983 1726883064.03633: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.03641: variable 'ansible_pipelining' from source: unknown 28983 1726883064.03643: variable 'ansible_timeout' from source: unknown 28983 1726883064.03649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.03789: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883064.03794: variable 'omit' from source: magic vars 28983 1726883064.03799: starting attempt loop 28983 1726883064.03802: running the handler 28983 1726883064.03815: handler run complete 28983 1726883064.03824: attempt loop complete, returning result 28983 1726883064.03826: _execute() done 28983 1726883064.03830: dumping result to json 28983 1726883064.03837: done dumping result, returning 28983 
1726883064.03844: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affe814-3a2d-b16d-c0a7-0000000017a9] 28983 1726883064.03849: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017a9 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28983 1726883064.04019: no more pending results, returning what we have 28983 1726883064.04023: results queue empty 28983 1726883064.04024: checking for any_errors_fatal 28983 1726883064.04032: done checking for any_errors_fatal 28983 1726883064.04033: checking for max_fail_percentage 28983 1726883064.04037: done checking for max_fail_percentage 28983 1726883064.04038: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.04038: done checking to see if all hosts have failed 28983 1726883064.04039: getting the remaining hosts for this loop 28983 1726883064.04041: done getting the remaining hosts for this loop 28983 1726883064.04045: getting the next task for host managed_node2 28983 1726883064.04053: done getting next task for host managed_node2 28983 1726883064.04055: ^ task is: TASK: Show current_interfaces 28983 1726883064.04060: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883064.04064: getting variables 28983 1726883064.04065: in VariableManager get_vars() 28983 1726883064.04101: Calling all_inventory to load vars for managed_node2 28983 1726883064.04104: Calling groups_inventory to load vars for managed_node2 28983 1726883064.04107: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.04116: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.04119: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.04123: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.04649: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017a9 28983 1726883064.04653: WORKER PROCESS EXITING 28983 1726883064.05490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.08298: done with get_vars() 28983 1726883064.08344: done getting variables 28983 1726883064.08414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:44:24 -0400 (0:00:00.068) 0:01:34.082 ****** 28983 1726883064.08455: entering _queue_task() for managed_node2/debug 28983 1726883064.08805: worker is 1 (out of 1 available) 28983 1726883064.08819: exiting _queue_task() for managed_node2/debug 28983 1726883064.08832: done queuing things up, now waiting for results queue to drain 28983 1726883064.09038: waiting for pending 
results... 28983 1726883064.09163: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28983 1726883064.09343: in run() - task 0affe814-3a2d-b16d-c0a7-00000000176e 28983 1726883064.09351: variable 'ansible_search_path' from source: unknown 28983 1726883064.09354: variable 'ansible_search_path' from source: unknown 28983 1726883064.09368: calling self._execute() 28983 1726883064.09442: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.09449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.09463: variable 'omit' from source: magic vars 28983 1726883064.09921: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.09940: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.09949: variable 'omit' from source: magic vars 28983 1726883064.10021: variable 'omit' from source: magic vars 28983 1726883064.10150: variable 'current_interfaces' from source: set_fact 28983 1726883064.10170: variable 'omit' from source: magic vars 28983 1726883064.10219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883064.10266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883064.10290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883064.10312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883064.10323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883064.10366: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883064.10370: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.10378: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.10498: Set connection var ansible_connection to ssh 28983 1726883064.10512: Set connection var ansible_shell_executable to /bin/sh 28983 1726883064.10523: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883064.10537: Set connection var ansible_timeout to 10 28983 1726883064.10544: Set connection var ansible_pipelining to False 28983 1726883064.10547: Set connection var ansible_shell_type to sh 28983 1726883064.10582: variable 'ansible_shell_executable' from source: unknown 28983 1726883064.10586: variable 'ansible_connection' from source: unknown 28983 1726883064.10637: variable 'ansible_module_compression' from source: unknown 28983 1726883064.10642: variable 'ansible_shell_type' from source: unknown 28983 1726883064.10645: variable 'ansible_shell_executable' from source: unknown 28983 1726883064.10648: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.10650: variable 'ansible_pipelining' from source: unknown 28983 1726883064.10653: variable 'ansible_timeout' from source: unknown 28983 1726883064.10655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.10775: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883064.10787: variable 'omit' from source: magic vars 28983 1726883064.10799: starting attempt loop 28983 1726883064.10803: running the handler 28983 1726883064.10858: handler run complete 28983 1726883064.10877: attempt loop complete, returning result 28983 1726883064.10881: _execute() done 28983 1726883064.10884: dumping result to json 28983 1726883064.10908: done dumping result, returning 28983 1726883064.10912: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affe814-3a2d-b16d-c0a7-00000000176e] 28983 1726883064.11037: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000176e 28983 1726883064.11108: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000176e 28983 1726883064.11112: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28983 1726883064.11164: no more pending results, returning what we have 28983 1726883064.11167: results queue empty 28983 1726883064.11168: checking for any_errors_fatal 28983 1726883064.11174: done checking for any_errors_fatal 28983 1726883064.11175: checking for max_fail_percentage 28983 1726883064.11177: done checking for max_fail_percentage 28983 1726883064.11178: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.11179: done checking to see if all hosts have failed 28983 1726883064.11180: getting the remaining hosts for this loop 28983 1726883064.11182: done getting the remaining hosts for this loop 28983 1726883064.11187: getting the next task for host managed_node2 28983 1726883064.11196: done getting next task for host managed_node2 28983 1726883064.11200: ^ task is: TASK: Setup 28983 1726883064.11204: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883064.11208: getting variables 28983 1726883064.11210: in VariableManager get_vars() 28983 1726883064.11250: Calling all_inventory to load vars for managed_node2 28983 1726883064.11254: Calling groups_inventory to load vars for managed_node2 28983 1726883064.11259: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.11270: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.11275: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.11279: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.13541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.16451: done with get_vars() 28983 1726883064.16488: done getting variables

TASK [Setup] *******************************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24
Friday 20 September 2024 21:44:24 -0400 (0:00:00.081) 0:01:34.163 ******

28983 1726883064.16598: entering _queue_task() for managed_node2/include_tasks 28983 1726883064.16916: worker is 1 (out of 1 available) 28983 1726883064.16930: exiting _queue_task() for managed_node2/include_tasks 28983 1726883064.16947: done queuing things up, now waiting for results queue to drain 28983 1726883064.16949: waiting for pending results... 
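The `ok: [managed_node2]` result above came from the "Show current_interfaces" debug task. The log records only the task name and its rendered message, not the playbook source; a hedged reconstruction of a task that would produce that output is:

```yaml
# Hypothetical sketch -- the actual task file is not shown in this log.
# A debug task rendering the current_interfaces variable into the MSG
# line seen above would look like:
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```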
28983 1726883064.17357: running TaskExecutor() for managed_node2/TASK: Setup 28983 1726883064.17443: in run() - task 0affe814-3a2d-b16d-c0a7-000000001747 28983 1726883064.17456: variable 'ansible_search_path' from source: unknown 28983 1726883064.17460: variable 'ansible_search_path' from source: unknown 28983 1726883064.17463: variable 'lsr_setup' from source: include params 28983 1726883064.17679: variable 'lsr_setup' from source: include params 28983 1726883064.17761: variable 'omit' from source: magic vars 28983 1726883064.17923: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.17936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.17951: variable 'omit' from source: magic vars 28983 1726883064.18258: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.18269: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.18277: variable 'item' from source: unknown 28983 1726883064.18357: variable 'item' from source: unknown 28983 1726883064.18427: variable 'item' from source: unknown 28983 1726883064.18476: variable 'item' from source: unknown 28983 1726883064.18856: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.18861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.18864: variable 'omit' from source: magic vars 28983 1726883064.18867: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.18869: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.18874: variable 'item' from source: unknown 28983 1726883064.18877: variable 'item' from source: unknown 28983 1726883064.18917: variable 'item' from source: unknown 28983 1726883064.18995: variable 'item' from source: unknown 28983 1726883064.19080: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883064.19186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.19190: variable 'omit' from source: magic vars 28983 1726883064.19300: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.19312: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.19318: variable 'item' from source: unknown 28983 1726883064.19402: variable 'item' from source: unknown 28983 1726883064.19431: variable 'item' from source: unknown 28983 1726883064.19512: variable 'item' from source: unknown 28983 1726883064.19577: dumping result to json 28983 1726883064.19581: done dumping result, returning 28983 1726883064.19584: done running TaskExecutor() for managed_node2/TASK: Setup [0affe814-3a2d-b16d-c0a7-000000001747] 28983 1726883064.19588: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001747 28983 1726883064.19788: no more pending results, returning what we have 28983 1726883064.19793: in VariableManager get_vars() 28983 1726883064.19836: Calling all_inventory to load vars for managed_node2 28983 1726883064.19840: Calling groups_inventory to load vars for managed_node2 28983 1726883064.19844: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.19851: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001747 28983 1726883064.19856: WORKER PROCESS EXITING 28983 1726883064.19866: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.19870: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.19875: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.22247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.25112: done with get_vars() 28983 1726883064.25145: variable 'ansible_search_path' from source: unknown 28983 1726883064.25147: variable 'ansible_search_path' from source: unknown 28983 
1726883064.25193: variable 'ansible_search_path' from source: unknown 28983 1726883064.25195: variable 'ansible_search_path' from source: unknown 28983 1726883064.25233: variable 'ansible_search_path' from source: unknown 28983 1726883064.25237: variable 'ansible_search_path' from source: unknown 28983 1726883064.25273: we have included files to process 28983 1726883064.25275: generating all_blocks data 28983 1726883064.25277: done generating all_blocks data 28983 1726883064.25283: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883064.25284: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883064.25287: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883064.25567: done processing included file 28983 1726883064.25569: iterating over new_blocks loaded from include file 28983 1726883064.25571: in VariableManager get_vars() 28983 1726883064.25590: done with get_vars() 28983 1726883064.25592: filtering new block on tags 28983 1726883064.25636: done filtering new block on tags 28983 1726883064.25640: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 28983 1726883064.25646: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883064.25647: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883064.25650: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883064.25770: done processing included file 28983 1726883064.25773: iterating over new_blocks loaded from include file 28983 1726883064.25774: in VariableManager get_vars() 28983 1726883064.25792: done with get_vars() 28983 1726883064.25794: filtering new block on tags 28983 1726883064.25821: done filtering new block on tags 28983 1726883064.25823: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 28983 1726883064.25827: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 28983 1726883064.25829: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 28983 1726883064.25832: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 28983 1726883064.25947: done processing included file 28983 1726883064.25949: iterating over new_blocks loaded from include file 28983 1726883064.25951: in VariableManager get_vars() 28983 1726883064.25968: done with get_vars() 28983 1726883064.25970: filtering new block on tags 28983 1726883064.25997: done filtering new block on tags 28983 1726883064.26000: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node2 => (item=tasks/remove_profile.yml) 28983 1726883064.26004: extending task lists for all hosts with included blocks 28983 1726883064.26907: done extending task lists 28983 1726883064.26914: done processing included files 28983 
1726883064.26915: results queue empty 28983 1726883064.26916: checking for any_errors_fatal 28983 1726883064.26920: done checking for any_errors_fatal 28983 1726883064.26921: checking for max_fail_percentage 28983 1726883064.26922: done checking for max_fail_percentage 28983 1726883064.26924: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.26925: done checking to see if all hosts have failed 28983 1726883064.26925: getting the remaining hosts for this loop 28983 1726883064.26927: done getting the remaining hosts for this loop 28983 1726883064.26930: getting the next task for host managed_node2 28983 1726883064.26938: done getting next task for host managed_node2 28983 1726883064.26940: ^ task is: TASK: Include network role 28983 1726883064.26943: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883064.26946: getting variables 28983 1726883064.26948: in VariableManager get_vars() 28983 1726883064.26959: Calling all_inventory to load vars for managed_node2 28983 1726883064.26962: Calling groups_inventory to load vars for managed_node2 28983 1726883064.26965: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.26971: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.26973: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.26977: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.28958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.31825: done with get_vars() 28983 1726883064.31860: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3
Friday 20 September 2024 21:44:24 -0400 (0:00:00.153) 0:01:34.317 ******

28983 1726883064.31949: entering _queue_task() for managed_node2/include_role 28983 1726883064.32290: worker is 1 (out of 1 available) 28983 1726883064.32303: exiting _queue_task() for managed_node2/include_role 28983 1726883064.32318: done queuing things up, now waiting for results queue to drain 28983 1726883064.32320: waiting for pending results... 
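The Setup task above evaluated `ansible_distribution_major_version != '6'` once per `item` and then queued three included files (tasks/create_bridge_profile.yml, tasks/activate_profile.yml, tasks/remove_profile.yml) driven by the `lsr_setup` include param. A hedged sketch of a loop-driven include consistent with that trace (the real task definition is not shown here, and the `when` placement is an assumption):

```yaml
# Hypothetical sketch reconstructed from the trace: lsr_setup holds the
# list of task files to include, and each item is gated on the
# distribution-version conditional seen in the log.
- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
  when: ansible_distribution_major_version != '6'
```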
28983 1726883064.32626: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883064.32842: in run() - task 0affe814-3a2d-b16d-c0a7-0000000017d0 28983 1726883064.32847: variable 'ansible_search_path' from source: unknown 28983 1726883064.32850: variable 'ansible_search_path' from source: unknown 28983 1726883064.32853: calling self._execute() 28983 1726883064.32941: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.32949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.32962: variable 'omit' from source: magic vars 28983 1726883064.33397: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.33442: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.33450: _execute() done 28983 1726883064.33454: dumping result to json 28983 1726883064.33456: done dumping result, returning 28983 1726883064.33459: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-0000000017d0] 28983 1726883064.33462: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d0 28983 1726883064.33630: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d0 28983 1726883064.33636: WORKER PROCESS EXITING 28983 1726883064.33666: no more pending results, returning what we have 28983 1726883064.33672: in VariableManager get_vars() 28983 1726883064.33720: Calling all_inventory to load vars for managed_node2 28983 1726883064.33723: Calling groups_inventory to load vars for managed_node2 28983 1726883064.33727: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.33742: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.33746: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.33751: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.35982: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.38878: done with get_vars() 28983 1726883064.38914: variable 'ansible_search_path' from source: unknown 28983 1726883064.38916: variable 'ansible_search_path' from source: unknown 28983 1726883064.39173: variable 'omit' from source: magic vars 28983 1726883064.39230: variable 'omit' from source: magic vars 28983 1726883064.39252: variable 'omit' from source: magic vars 28983 1726883064.39257: we have included files to process 28983 1726883064.39258: generating all_blocks data 28983 1726883064.39260: done generating all_blocks data 28983 1726883064.39262: processing included file: fedora.linux_system_roles.network 28983 1726883064.39288: in VariableManager get_vars() 28983 1726883064.39305: done with get_vars() 28983 1726883064.39340: in VariableManager get_vars() 28983 1726883064.39362: done with get_vars() 28983 1726883064.39407: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883064.39576: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883064.39688: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883064.40430: in VariableManager get_vars() 28983 1726883064.40459: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883064.42964: iterating over new_blocks loaded from include file 28983 1726883064.42967: in VariableManager get_vars() 28983 1726883064.42990: done with get_vars() 28983 1726883064.42992: filtering new block on tags 28983 1726883064.43408: done filtering new block on tags 28983 1726883064.43413: in VariableManager get_vars() 28983 1726883064.43437: done with get_vars() 28983 1726883064.43439: filtering new block on tags 28983 1726883064.43461: done 
filtering new block on tags 28983 1726883064.43464: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883064.43470: extending task lists for all hosts with included blocks 28983 1726883064.43690: done extending task lists 28983 1726883064.43691: done processing included files 28983 1726883064.43692: results queue empty 28983 1726883064.43693: checking for any_errors_fatal 28983 1726883064.43698: done checking for any_errors_fatal 28983 1726883064.43699: checking for max_fail_percentage 28983 1726883064.43700: done checking for max_fail_percentage 28983 1726883064.43701: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.43702: done checking to see if all hosts have failed 28983 1726883064.43703: getting the remaining hosts for this loop 28983 1726883064.43705: done getting the remaining hosts for this loop 28983 1726883064.43708: getting the next task for host managed_node2 28983 1726883064.43713: done getting next task for host managed_node2 28983 1726883064.43717: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883064.43721: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883064.43735: getting variables 28983 1726883064.43736: in VariableManager get_vars() 28983 1726883064.43752: Calling all_inventory to load vars for managed_node2 28983 1726883064.43755: Calling groups_inventory to load vars for managed_node2 28983 1726883064.43758: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.43765: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.43768: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.43771: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.45796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.48652: done with get_vars() 28983 1726883064.48692: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:44:24 -0400 (0:00:00.168) 0:01:34.485 ******

28983 1726883064.48791: entering _queue_task() for managed_node2/include_tasks 28983 1726883064.49192: worker is 1 (out of 1 available) 28983 1726883064.49205: exiting _queue_task() for managed_node2/include_tasks 28983 1726883064.49218: done queuing things up, now waiting for results queue to drain 28983 1726883064.49220: waiting for pending results... 
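The "Include network role" task at create_bridge_profile.yml:3 pulled in the fedora.linux_system_roles.network role, which is why the trace loads the role's defaults/main.yml, meta/main.yml, and tasks/main.yml. A hedged sketch of such a task (the actual file content is not reproduced in this log):

```yaml
# Hypothetical sketch: an include_role consistent with the trace above,
# which shows the fedora.linux_system_roles.network role being resolved
# from the collection path and its defaults/meta/tasks files loaded.
- name: Include network role
  include_role:
    name: fedora.linux_system_roles.network
```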
28983 1726883064.49876: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883064.49883: in run() - task 0affe814-3a2d-b16d-c0a7-00000000183a 28983 1726883064.49887: variable 'ansible_search_path' from source: unknown 28983 1726883064.49890: variable 'ansible_search_path' from source: unknown 28983 1726883064.49892: calling self._execute() 28983 1726883064.49895: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.49898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.49903: variable 'omit' from source: magic vars 28983 1726883064.50379: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.50397: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.50401: _execute() done 28983 1726883064.50414: dumping result to json 28983 1726883064.50417: done dumping result, returning 28983 1726883064.50420: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-00000000183a] 28983 1726883064.50427: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183a 28983 1726883064.50608: no more pending results, returning what we have 28983 1726883064.50614: in VariableManager get_vars() 28983 1726883064.50677: Calling all_inventory to load vars for managed_node2 28983 1726883064.50681: Calling groups_inventory to load vars for managed_node2 28983 1726883064.50684: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.50699: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.50704: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.50709: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.51229: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183a 28983 
1726883064.51234: WORKER PROCESS EXITING 28983 1726883064.57581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.59694: done with get_vars() 28983 1726883064.59727: variable 'ansible_search_path' from source: unknown 28983 1726883064.59729: variable 'ansible_search_path' from source: unknown 28983 1726883064.59778: we have included files to process 28983 1726883064.59780: generating all_blocks data 28983 1726883064.59782: done generating all_blocks data 28983 1726883064.59784: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883064.59786: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883064.59788: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883064.60501: done processing included file 28983 1726883064.60503: iterating over new_blocks loaded from include file 28983 1726883064.60505: in VariableManager get_vars() 28983 1726883064.60537: done with get_vars() 28983 1726883064.60540: filtering new block on tags 28983 1726883064.60580: done filtering new block on tags 28983 1726883064.60583: in VariableManager get_vars() 28983 1726883064.60613: done with get_vars() 28983 1726883064.60615: filtering new block on tags 28983 1726883064.60678: done filtering new block on tags 28983 1726883064.60682: in VariableManager get_vars() 28983 1726883064.60716: done with get_vars() 28983 1726883064.60719: filtering new block on tags 28983 1726883064.60787: done filtering new block on tags 28983 1726883064.60790: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883064.60796: extending task lists for all hosts 
with included blocks 28983 1726883064.63756: done extending task lists 28983 1726883064.63758: done processing included files 28983 1726883064.63759: results queue empty 28983 1726883064.63760: checking for any_errors_fatal 28983 1726883064.63763: done checking for any_errors_fatal 28983 1726883064.63764: checking for max_fail_percentage 28983 1726883064.63766: done checking for max_fail_percentage 28983 1726883064.63767: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.63768: done checking to see if all hosts have failed 28983 1726883064.63769: getting the remaining hosts for this loop 28983 1726883064.63770: done getting the remaining hosts for this loop 28983 1726883064.63774: getting the next task for host managed_node2 28983 1726883064.63780: done getting next task for host managed_node2 28983 1726883064.63783: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883064.63787: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883064.63800: getting variables 28983 1726883064.63801: in VariableManager get_vars() 28983 1726883064.63818: Calling all_inventory to load vars for managed_node2 28983 1726883064.63821: Calling groups_inventory to load vars for managed_node2 28983 1726883064.63825: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.63831: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.63838: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.63843: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.65807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.68690: done with get_vars() 28983 1726883064.68728: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:44:24 -0400 (0:00:00.200) 0:01:34.686 ******

28983 1726883064.68826: entering _queue_task() for managed_node2/setup 28983 1726883064.69232: worker is 1 (out of 1 available) 28983 1726883064.69248: exiting _queue_task() for managed_node2/setup 28983 1726883064.69262: done queuing things up, now waiting for results queue to drain 28983 1726883064.69264: waiting for pending results... 
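The "Ensure ansible_facts used by role are present" task is queued as a `setup` (fact-gathering) action guarded by a conditional: the trace evaluates `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, i.e. gather facts only when some fact the role needs is missing. A hedged sketch of such a task (the conditional is taken from the trace; the `gather_subset` value is an assumption, not confirmed by this log):

```yaml
# Hypothetical sketch: run the setup module only when the set difference
# between the facts the role requires and the facts already collected is
# non-empty, as evaluated in the trace.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min  # assumption -- subset not visible in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```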
28983 1726883064.69688: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883064.69998: in run() - task 0affe814-3a2d-b16d-c0a7-000000001897 28983 1726883064.70002: variable 'ansible_search_path' from source: unknown 28983 1726883064.70008: variable 'ansible_search_path' from source: unknown 28983 1726883064.70013: calling self._execute() 28983 1726883064.70084: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.70092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.70120: variable 'omit' from source: magic vars 28983 1726883064.70617: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.70631: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.70953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883064.73095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883064.73156: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883064.73190: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883064.73224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883064.73249: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883064.73322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883064.73348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883064.73369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883064.73410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883064.73423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883064.73469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883064.73491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883064.73516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883064.73549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883064.73561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883064.73700: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883064.73709: variable 'ansible_facts' from source: unknown 28983 1726883064.74429: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883064.74435: when evaluation is False, skipping this task 28983 1726883064.74439: _execute() done 28983 1726883064.74441: dumping result to json 28983 1726883064.74445: done dumping result, returning 28983 1726883064.74452: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000001897] 28983 1726883064.74458: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001897 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883064.74609: no more pending results, returning what we have 28983 1726883064.74613: results queue empty 28983 1726883064.74614: checking for any_errors_fatal 28983 1726883064.74616: done checking for any_errors_fatal 28983 1726883064.74617: checking for max_fail_percentage 28983 1726883064.74619: done checking for max_fail_percentage 28983 1726883064.74620: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.74620: done checking to see if all hosts have failed 28983 1726883064.74621: getting the remaining hosts for this loop 28983 1726883064.74623: done getting the remaining hosts for this loop 28983 1726883064.74628: getting the next task for host managed_node2 28983 1726883064.74643: done getting next task for host managed_node2 28983 1726883064.74653: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883064.74661: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883064.74686: getting variables 28983 1726883064.74687: in VariableManager get_vars() 28983 1726883064.74733: Calling all_inventory to load vars for managed_node2 28983 1726883064.74747: Calling groups_inventory to load vars for managed_node2 28983 1726883064.74750: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.74756: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001897 28983 1726883064.74759: WORKER PROCESS EXITING 28983 1726883064.74768: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.74775: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.74785: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.76543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.78306: done with get_vars() 28983 1726883064.78332: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:44:24 -0400 (0:00:00.095) 0:01:34.781 ****** 28983 1726883064.78413: entering _queue_task() for managed_node2/stat 28983 1726883064.78666: worker is 1 (out of 1 available) 28983 1726883064.78684: exiting _queue_task() for managed_node2/stat 28983 1726883064.78697: done queuing things up, now waiting for results queue to drain 28983 1726883064.78699: waiting for pending results... 
28983 1726883064.78894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883064.79035: in run() - task 0affe814-3a2d-b16d-c0a7-000000001899 28983 1726883064.79079: variable 'ansible_search_path' from source: unknown 28983 1726883064.79084: variable 'ansible_search_path' from source: unknown 28983 1726883064.79241: calling self._execute() 28983 1726883064.79245: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.79248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.79253: variable 'omit' from source: magic vars 28983 1726883064.79674: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.79705: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.79892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883064.80318: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883064.80321: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883064.80325: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883064.80327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883064.80475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883064.80502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883064.80539: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883064.80568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883064.80663: variable '__network_is_ostree' from source: set_fact 28983 1726883064.80678: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883064.80682: when evaluation is False, skipping this task 28983 1726883064.80684: _execute() done 28983 1726883064.80687: dumping result to json 28983 1726883064.80689: done dumping result, returning 28983 1726883064.80696: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000001899] 28983 1726883064.80706: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001899 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883064.80876: no more pending results, returning what we have 28983 1726883064.80881: results queue empty 28983 1726883064.80882: checking for any_errors_fatal 28983 1726883064.80897: done checking for any_errors_fatal 28983 1726883064.80898: checking for max_fail_percentage 28983 1726883064.80900: done checking for max_fail_percentage 28983 1726883064.80901: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.80902: done checking to see if all hosts have failed 28983 1726883064.80903: getting the remaining hosts for this loop 28983 1726883064.80905: done getting the remaining hosts for this loop 28983 1726883064.80910: getting the next task for host managed_node2 28983 1726883064.80922: done getting next task for host managed_node2 28983 
1726883064.80927: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883064.80933: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883064.80945: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001899 28983 1726883064.80948: WORKER PROCESS EXITING 28983 1726883064.80962: getting variables 28983 1726883064.80963: in VariableManager get_vars() 28983 1726883064.81012: Calling all_inventory to load vars for managed_node2 28983 1726883064.81015: Calling groups_inventory to load vars for managed_node2 28983 1726883064.81018: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.81031: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.81036: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.81039: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.82284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.84041: done with get_vars() 28983 1726883064.84074: done getting variables 28983 1726883064.84138: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:44:24 -0400 (0:00:00.057) 0:01:34.839 ****** 28983 1726883064.84182: entering _queue_task() for managed_node2/set_fact 28983 1726883064.84479: worker is 1 (out of 1 available) 28983 1726883064.84494: exiting _queue_task() for managed_node2/set_fact 28983 1726883064.84509: done queuing things up, now waiting for results queue to drain 28983 1726883064.84511: waiting for pending results... 
28983 1726883064.84845: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883064.84989: in run() - task 0affe814-3a2d-b16d-c0a7-00000000189a 28983 1726883064.85021: variable 'ansible_search_path' from source: unknown 28983 1726883064.85031: variable 'ansible_search_path' from source: unknown 28983 1726883064.85061: calling self._execute() 28983 1726883064.85157: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.85163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.85177: variable 'omit' from source: magic vars 28983 1726883064.85582: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.85603: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.85802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883064.86105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883064.86157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883064.86232: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883064.86241: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883064.86371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883064.86403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883064.86426: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883064.86452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883064.86546: variable '__network_is_ostree' from source: set_fact 28983 1726883064.86552: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883064.86560: when evaluation is False, skipping this task 28983 1726883064.86563: _execute() done 28983 1726883064.86578: dumping result to json 28983 1726883064.86581: done dumping result, returning 28983 1726883064.86584: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-00000000189a] 28983 1726883064.86587: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000189a 28983 1726883064.86684: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000189a 28983 1726883064.86687: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883064.86745: no more pending results, returning what we have 28983 1726883064.86749: results queue empty 28983 1726883064.86750: checking for any_errors_fatal 28983 1726883064.86755: done checking for any_errors_fatal 28983 1726883064.86756: checking for max_fail_percentage 28983 1726883064.86758: done checking for max_fail_percentage 28983 1726883064.86759: checking to see if all hosts have failed and the running result is not ok 28983 1726883064.86760: done checking to see if all hosts have failed 28983 1726883064.86761: getting the remaining hosts for this loop 28983 1726883064.86763: done getting the remaining hosts for this loop 
28983 1726883064.86767: getting the next task for host managed_node2 28983 1726883064.86792: done getting next task for host managed_node2 28983 1726883064.86796: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883064.86802: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883064.86821: getting variables 28983 1726883064.86822: in VariableManager get_vars() 28983 1726883064.86861: Calling all_inventory to load vars for managed_node2 28983 1726883064.86863: Calling groups_inventory to load vars for managed_node2 28983 1726883064.86865: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883064.86874: Calling all_plugins_play to load vars for managed_node2 28983 1726883064.86876: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883064.86879: Calling groups_plugins_play to load vars for managed_node2 28983 1726883064.88282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883064.90346: done with get_vars() 28983 1726883064.90368: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:44:24 -0400 (0:00:00.062) 0:01:34.902 ****** 28983 1726883064.90449: entering _queue_task() for managed_node2/service_facts 28983 1726883064.90681: worker is 1 (out of 1 available) 28983 1726883064.90695: exiting _queue_task() for managed_node2/service_facts 28983 1726883064.90707: done queuing things up, now waiting for results queue to drain 28983 1726883064.90709: waiting for pending results... 
28983 1726883064.90922: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883064.91070: in run() - task 0affe814-3a2d-b16d-c0a7-00000000189c 28983 1726883064.91088: variable 'ansible_search_path' from source: unknown 28983 1726883064.91092: variable 'ansible_search_path' from source: unknown 28983 1726883064.91122: calling self._execute() 28983 1726883064.91212: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.91217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.91229: variable 'omit' from source: magic vars 28983 1726883064.91568: variable 'ansible_distribution_major_version' from source: facts 28983 1726883064.91581: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883064.91587: variable 'omit' from source: magic vars 28983 1726883064.91656: variable 'omit' from source: magic vars 28983 1726883064.91686: variable 'omit' from source: magic vars 28983 1726883064.91726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883064.91760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883064.91781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883064.91797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883064.91807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883064.91843: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883064.91847: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.91850: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883064.91927: Set connection var ansible_connection to ssh 28983 1726883064.91943: Set connection var ansible_shell_executable to /bin/sh 28983 1726883064.91956: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883064.91963: Set connection var ansible_timeout to 10 28983 1726883064.91969: Set connection var ansible_pipelining to False 28983 1726883064.91972: Set connection var ansible_shell_type to sh 28983 1726883064.91995: variable 'ansible_shell_executable' from source: unknown 28983 1726883064.91998: variable 'ansible_connection' from source: unknown 28983 1726883064.92001: variable 'ansible_module_compression' from source: unknown 28983 1726883064.92004: variable 'ansible_shell_type' from source: unknown 28983 1726883064.92008: variable 'ansible_shell_executable' from source: unknown 28983 1726883064.92012: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883064.92017: variable 'ansible_pipelining' from source: unknown 28983 1726883064.92020: variable 'ansible_timeout' from source: unknown 28983 1726883064.92026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883064.92198: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883064.92209: variable 'omit' from source: magic vars 28983 1726883064.92214: starting attempt loop 28983 1726883064.92217: running the handler 28983 1726883064.92232: _low_level_execute_command(): starting 28983 1726883064.92242: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883064.92738: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883064.92775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883064.92780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883064.92828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883064.92832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883064.92916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883064.94750: stdout chunk (state=3): >>>/root <<< 28983 1726883064.94862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883064.94916: stderr chunk (state=3): >>><<< 28983 1726883064.94920: stdout chunk (state=3): >>><<< 28983 1726883064.94939: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883064.94956: _low_level_execute_command(): starting 28983 1726883064.94962: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979 `" && echo ansible-tmp-1726883064.9494412-32526-218215157438979="` echo /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979 `" ) && sleep 0' 28983 1726883064.95647: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883064.95706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883064.95806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883064.97982: stdout chunk (state=3): >>>ansible-tmp-1726883064.9494412-32526-218215157438979=/root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979 <<< 28983 1726883064.98100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883064.98142: stderr chunk (state=3): >>><<< 28983 1726883064.98181: stdout chunk (state=3): >>><<< 28983 1726883064.98440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883064.9494412-32526-218215157438979=/root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883064.98444: variable 'ansible_module_compression' from source: unknown 28983 1726883064.98447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883064.98450: variable 'ansible_facts' from source: unknown 28983 1726883064.98453: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py 28983 1726883064.98592: Sending initial data 28983 1726883064.98693: Sent initial data (162 bytes) 28983 1726883064.99250: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883064.99350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883064.99397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883064.99544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
28983 1726883064.99651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883065.01431: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883065.01512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883065.01577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpf_qz6c8t /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py <<< 28983 1726883065.01607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py" <<< 28983 1726883065.01677: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpf_qz6c8t" to remote "/root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py" <<< 28983 1726883065.03049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883065.03053: stdout chunk (state=3): >>><<< 28983 1726883065.03055: stderr chunk (state=3): >>><<< 28983 1726883065.03178: done transferring module to remote 28983 1726883065.03182: _low_level_execute_command(): starting 28983 1726883065.03184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/ /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py && sleep 0' 28983 1726883065.03736: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883065.03754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883065.03785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883065.03849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883065.03946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883065.03975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883065.04018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883065.04130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883065.06251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883065.06267: stdout chunk (state=3): >>><<< 28983 1726883065.06340: stderr chunk (state=3): >>><<< 28983 1726883065.06344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883065.06354: _low_level_execute_command(): starting 28983 1726883065.06357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/AnsiballZ_service_facts.py && sleep 0' 28983 1726883065.07017: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883065.07036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883065.07058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883065.07090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883065.07250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883065.07253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883065.07316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883065.07388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883067.03716: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": 
"dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883067.05218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883067.05251: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883067.05345: stderr chunk (state=3): >>><<< 28983 1726883067.05355: stdout chunk (state=3): >>><<< 28983 1726883067.05394: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {...}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883067.06715: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883067.06732: _low_level_execute_command(): starting 28983 1726883067.06746: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883064.9494412-32526-218215157438979/ > /dev/null 2>&1 && sleep 0' 28983 1726883067.07411: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883067.07427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883067.07446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883067.07475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883067.07593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883067.07615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883067.07720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883067.09846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883067.09850: stdout chunk (state=3): >>><<< 28983 1726883067.09852: stderr chunk (state=3): >>><<< 28983 1726883067.09855: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883067.09857: handler run complete 28983 1726883067.10137: variable 'ansible_facts' from source: unknown 28983 1726883067.10395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883067.11351: variable 'ansible_facts' from source: unknown 28983 1726883067.11619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883067.12022: attempt loop complete, returning result 28983 1726883067.12056: _execute() done 28983 1726883067.12060: dumping result to json 28983 1726883067.12148: done dumping result, returning 28983 1726883067.12340: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-00000000189c] 28983 1726883067.12343: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000189c ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883067.13928: no more pending results, returning what we have 28983 1726883067.13931: results queue empty 28983 1726883067.13932: checking for any_errors_fatal 28983 1726883067.13943: done checking for any_errors_fatal 28983 1726883067.13945: checking for max_fail_percentage 28983 1726883067.13947: done checking for max_fail_percentage 28983 1726883067.13948: checking to see if all hosts have failed and the running result is not ok 28983 1726883067.13949: done checking to see if all hosts have failed 28983 1726883067.13950: getting the remaining hosts for this loop 
28983 1726883067.13952: done getting the remaining hosts for this loop 28983 1726883067.13957: getting the next task for host managed_node2 28983 1726883067.13965: done getting next task for host managed_node2 28983 1726883067.13969: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883067.13979: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883067.13992: getting variables 28983 1726883067.13994: in VariableManager get_vars() 28983 1726883067.14033: Calling all_inventory to load vars for managed_node2 28983 1726883067.14166: Calling groups_inventory to load vars for managed_node2 28983 1726883067.14170: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883067.14183: Calling all_plugins_play to load vars for managed_node2 28983 1726883067.14187: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883067.14191: Calling groups_plugins_play to load vars for managed_node2 28983 1726883067.14782: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000189c 28983 1726883067.14785: WORKER PROCESS EXITING 28983 1726883067.16666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883067.19985: done with get_vars() 28983 1726883067.20022: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:44:27 -0400 (0:00:02.296) 0:01:37.199 ****** 28983 1726883067.20142: entering _queue_task() for managed_node2/package_facts 28983 1726883067.20491: worker is 1 (out of 1 available) 28983 1726883067.20505: exiting _queue_task() for managed_node2/package_facts 28983 1726883067.20518: done queuing things up, now waiting for results queue to drain 28983 1726883067.20520: waiting for pending results... 
28983 1726883067.20841: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883067.21042: in run() - task 0affe814-3a2d-b16d-c0a7-00000000189d 28983 1726883067.21069: variable 'ansible_search_path' from source: unknown 28983 1726883067.21077: variable 'ansible_search_path' from source: unknown 28983 1726883067.21119: calling self._execute() 28983 1726883067.21240: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883067.21254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883067.21271: variable 'omit' from source: magic vars 28983 1726883067.21723: variable 'ansible_distribution_major_version' from source: facts 28983 1726883067.21745: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883067.21760: variable 'omit' from source: magic vars 28983 1726883067.21872: variable 'omit' from source: magic vars 28983 1726883067.21917: variable 'omit' from source: magic vars 28983 1726883067.21969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883067.22024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883067.22059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883067.22095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883067.22116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883067.22162: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883067.22172: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883067.22180: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883067.22309: Set connection var ansible_connection to ssh 28983 1726883067.22329: Set connection var ansible_shell_executable to /bin/sh 28983 1726883067.22346: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883067.22361: Set connection var ansible_timeout to 10 28983 1726883067.22414: Set connection var ansible_pipelining to False 28983 1726883067.22417: Set connection var ansible_shell_type to sh 28983 1726883067.22420: variable 'ansible_shell_executable' from source: unknown 28983 1726883067.22422: variable 'ansible_connection' from source: unknown 28983 1726883067.22425: variable 'ansible_module_compression' from source: unknown 28983 1726883067.22431: variable 'ansible_shell_type' from source: unknown 28983 1726883067.22441: variable 'ansible_shell_executable' from source: unknown 28983 1726883067.22449: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883067.22457: variable 'ansible_pipelining' from source: unknown 28983 1726883067.22465: variable 'ansible_timeout' from source: unknown 28983 1726883067.22473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883067.22741: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883067.22746: variable 'omit' from source: magic vars 28983 1726883067.22748: starting attempt loop 28983 1726883067.22752: running the handler 28983 1726883067.22772: _low_level_execute_command(): starting 28983 1726883067.22786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883067.23635: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883067.23684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883067.23713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883067.23730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883067.23843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883067.25606: stdout chunk (state=3): >>>/root <<< 28983 1726883067.25900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883067.25905: stdout chunk (state=3): >>><<< 28983 1726883067.25907: stderr chunk (state=3): >>><<< 28983 1726883067.25910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883067.25913: _low_level_execute_command(): starting 28983 1726883067.25916: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525 `" && echo ansible-tmp-1726883067.258313-32603-90531595504525="` echo /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525 `" ) && sleep 0' 28983 1726883067.26517: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883067.26532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883067.26554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883067.26588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883067.26610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883067.26721: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883067.26739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883067.26761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883067.26863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883067.28898: stdout chunk (state=3): >>>ansible-tmp-1726883067.258313-32603-90531595504525=/root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525 <<< 28983 1726883067.29083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883067.29114: stdout chunk (state=3): >>><<< 28983 1726883067.29117: stderr chunk (state=3): >>><<< 28983 1726883067.29153: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883067.258313-32603-90531595504525=/root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883067.29223: variable 'ansible_module_compression' from source: unknown 28983 1726883067.29263: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883067.29343: variable 'ansible_facts' from source: unknown 28983 1726883067.29536: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py 28983 1726883067.29790: Sending initial data 28983 1726883067.29794: Sent initial data (160 bytes) 28983 1726883067.30568: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883067.30688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883067.30715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883067.30824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883067.32482: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726883067.32508: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883067.32581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883067.32668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmppt4y5ws5 /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py <<< 28983 1726883067.32675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py" <<< 28983 1726883067.32730: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmppt4y5ws5" to remote "/root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py" <<< 28983 1726883067.35409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883067.35558: stderr chunk (state=3): >>><<< 28983 1726883067.35562: stdout chunk (state=3): >>><<< 28983 1726883067.35564: done transferring module to remote 28983 1726883067.35566: _low_level_execute_command(): starting 28983 1726883067.35569: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/ /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py && sleep 0' 28983 1726883067.36128: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883067.36146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883067.36159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883067.36189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883067.36207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 <<< 28983 1726883067.36296: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883067.36339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883067.36360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883067.36379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883067.36487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883067.38493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883067.38496: stdout chunk (state=3): >>><<< 28983 1726883067.38499: stderr chunk (state=3): >>><<< 28983 1726883067.38520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883067.38530: _low_level_execute_command(): starting 28983 1726883067.38616: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/AnsiballZ_package_facts.py && sleep 0' 28983 1726883067.39193: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883067.39202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883067.39214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883067.39231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883067.39245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883067.39259: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883067.39404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883067.39408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883067.39496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883068.02990: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": 
"brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": 
"7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", 
"release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": 
"sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": 
"502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": 
[{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": 
"boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": 
"1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": 
[{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883068.04785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883068.04789: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883068.04792: stdout chunk (state=3): >>><<< 28983 1726883068.04794: stderr chunk (state=3): >>><<< 28983 1726883068.04806: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": 
"36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": 
"10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", 
"version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": 
"cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": 
"3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": 
"libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": 
"2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": 
[{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": 
"39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": 
"502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": 
[{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": 
"elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": 
"kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", 
"version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883068.09294: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883068.09298: _low_level_execute_command(): starting 28983 1726883068.09301: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883067.258313-32603-90531595504525/ > /dev/null 2>&1 && sleep 0' 28983 1726883068.09944: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883068.09948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883068.09951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883068.09954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883068.09956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883068.09958: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883068.10012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883068.10059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883068.10076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883068.10094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883068.10242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883068.12441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883068.12444: stdout chunk (state=3): >>><<< 28983 1726883068.12447: stderr chunk (state=3): >>><<< 28983 1726883068.12449: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883068.12452: handler run complete 28983 1726883068.15209: variable 'ansible_facts' from source: unknown 28983 1726883068.16806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.21782: variable 'ansible_facts' from source: unknown 28983 1726883068.22799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.23915: attempt loop complete, returning result 28983 1726883068.23935: _execute() done 28983 1726883068.23938: dumping result to json 28983 1726883068.24126: done dumping result, returning 28983 1726883068.24136: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-00000000189d] 28983 1726883068.24142: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000189d 28983 1726883068.28368: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000189d 28983 1726883068.28372: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883068.28547: no more pending results, returning what we have 28983 1726883068.28551: results queue empty 28983 1726883068.28552: checking for any_errors_fatal 28983 1726883068.28562: done checking for any_errors_fatal 28983 1726883068.28563: checking for max_fail_percentage 28983 1726883068.28565: done checking for max_fail_percentage 28983 1726883068.28566: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.28567: done checking to see if all hosts have failed 28983 1726883068.28568: getting the remaining hosts for this loop 
28983 1726883068.28570: done getting the remaining hosts for this loop 28983 1726883068.28574: getting the next task for host managed_node2 28983 1726883068.28584: done getting next task for host managed_node2 28983 1726883068.28588: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883068.28594: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883068.28612: getting variables 28983 1726883068.28613: in VariableManager get_vars() 28983 1726883068.28653: Calling all_inventory to load vars for managed_node2 28983 1726883068.28657: Calling groups_inventory to load vars for managed_node2 28983 1726883068.28660: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.28675: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.28679: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.28683: Calling groups_plugins_play to load vars for managed_node2 28983 1726883068.32061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.36585: done with get_vars() 28983 1726883068.36627: done getting variables 28983 1726883068.36857: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:28 -0400 (0:00:01.168) 0:01:38.367 ****** 28983 1726883068.37016: entering _queue_task() for managed_node2/debug 28983 1726883068.37833: worker is 1 (out of 1 available) 28983 1726883068.37850: exiting _queue_task() for managed_node2/debug 28983 1726883068.37864: done queuing things up, now waiting for results queue to drain 28983 1726883068.37866: waiting for pending results... 
28983 1726883068.38220: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883068.38367: in run() - task 0affe814-3a2d-b16d-c0a7-00000000183b 28983 1726883068.38387: variable 'ansible_search_path' from source: unknown 28983 1726883068.38392: variable 'ansible_search_path' from source: unknown 28983 1726883068.38536: calling self._execute() 28983 1726883068.38552: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.38562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.38574: variable 'omit' from source: magic vars 28983 1726883068.39045: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.39065: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883068.39076: variable 'omit' from source: magic vars 28983 1726883068.39154: variable 'omit' from source: magic vars 28983 1726883068.39288: variable 'network_provider' from source: set_fact 28983 1726883068.39309: variable 'omit' from source: magic vars 28983 1726883068.39359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883068.39409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883068.39432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883068.39455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883068.39516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883068.39520: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883068.39524: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883068.39527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.39646: Set connection var ansible_connection to ssh 28983 1726883068.39660: Set connection var ansible_shell_executable to /bin/sh 28983 1726883068.39671: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883068.39686: Set connection var ansible_timeout to 10 28983 1726883068.39694: Set connection var ansible_pipelining to False 28983 1726883068.39697: Set connection var ansible_shell_type to sh 28983 1726883068.39734: variable 'ansible_shell_executable' from source: unknown 28983 1726883068.39740: variable 'ansible_connection' from source: unknown 28983 1726883068.39743: variable 'ansible_module_compression' from source: unknown 28983 1726883068.39746: variable 'ansible_shell_type' from source: unknown 28983 1726883068.39749: variable 'ansible_shell_executable' from source: unknown 28983 1726883068.39751: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.39757: variable 'ansible_pipelining' from source: unknown 28983 1726883068.39759: variable 'ansible_timeout' from source: unknown 28983 1726883068.39766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.39966: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883068.39970: variable 'omit' from source: magic vars 28983 1726883068.39975: starting attempt loop 28983 1726883068.39978: running the handler 28983 1726883068.40028: handler run complete 28983 1726883068.40053: attempt loop complete, returning result 28983 1726883068.40064: _execute() done 28983 1726883068.40067: dumping result to json 28983 1726883068.40070: done dumping result, returning 
28983 1726883068.40075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-00000000183b] 28983 1726883068.40078: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183b 28983 1726883068.40330: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183b 28983 1726883068.40333: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883068.40426: no more pending results, returning what we have 28983 1726883068.40429: results queue empty 28983 1726883068.40430: checking for any_errors_fatal 28983 1726883068.40439: done checking for any_errors_fatal 28983 1726883068.40440: checking for max_fail_percentage 28983 1726883068.40442: done checking for max_fail_percentage 28983 1726883068.40443: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.40444: done checking to see if all hosts have failed 28983 1726883068.40445: getting the remaining hosts for this loop 28983 1726883068.40447: done getting the remaining hosts for this loop 28983 1726883068.40451: getting the next task for host managed_node2 28983 1726883068.40459: done getting next task for host managed_node2 28983 1726883068.40463: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883068.40469: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883068.40626: getting variables 28983 1726883068.40628: in VariableManager get_vars() 28983 1726883068.40676: Calling all_inventory to load vars for managed_node2 28983 1726883068.40679: Calling groups_inventory to load vars for managed_node2 28983 1726883068.40682: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.40692: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.40696: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.40700: Calling groups_plugins_play to load vars for managed_node2 28983 1726883068.43402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.46703: done with get_vars() 28983 1726883068.46741: done getting variables 28983 1726883068.46818: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:28 -0400 (0:00:00.098) 0:01:38.466 ****** 28983 1726883068.46870: entering _queue_task() for managed_node2/fail 28983 1726883068.47453: worker is 1 (out of 1 available) 28983 1726883068.47464: exiting _queue_task() for managed_node2/fail 28983 1726883068.47477: done queuing things up, now waiting for results queue to drain 28983 1726883068.47479: waiting for pending results... 28983 1726883068.47611: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883068.47882: in run() - task 0affe814-3a2d-b16d-c0a7-00000000183c 28983 1726883068.47886: variable 'ansible_search_path' from source: unknown 28983 1726883068.47889: variable 'ansible_search_path' from source: unknown 28983 1726883068.47892: calling self._execute() 28983 1726883068.48016: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.48038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.48057: variable 'omit' from source: magic vars 28983 1726883068.48565: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.48594: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883068.48776: variable 'network_state' from source: role '' defaults 28983 1726883068.48861: Evaluated conditional (network_state != {}): False 28983 1726883068.48864: when evaluation is False, skipping this task 28983 1726883068.48867: _execute() done 28983 1726883068.48869: dumping result to json 28983 1726883068.48874: done dumping result, returning 28983 1726883068.48877: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-00000000183c] 28983 1726883068.48880: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883068.49138: no more pending results, returning what we have 28983 1726883068.49143: results queue empty 28983 1726883068.49144: checking for any_errors_fatal 28983 1726883068.49151: done checking for any_errors_fatal 28983 1726883068.49152: checking for max_fail_percentage 28983 1726883068.49154: done checking for max_fail_percentage 28983 1726883068.49155: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.49156: done checking to see if all hosts have failed 28983 1726883068.49157: getting the remaining hosts for this loop 28983 1726883068.49159: done getting the remaining hosts for this loop 28983 1726883068.49165: getting the next task for host managed_node2 28983 1726883068.49177: done getting next task for host managed_node2 28983 1726883068.49182: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883068.49192: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883068.49221: getting variables 28983 1726883068.49223: in VariableManager get_vars() 28983 1726883068.49393: Calling all_inventory to load vars for managed_node2 28983 1726883068.49397: Calling groups_inventory to load vars for managed_node2 28983 1726883068.49400: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.49469: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.49476: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.49481: Calling groups_plugins_play to load vars for managed_node2 28983 1726883068.50138: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183c 28983 1726883068.50141: WORKER PROCESS EXITING 28983 1726883068.51910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.55307: done with get_vars() 28983 1726883068.55346: done getting variables 28983 1726883068.55419: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:28 -0400 (0:00:00.085) 0:01:38.552 ****** 28983 1726883068.55469: entering _queue_task() for managed_node2/fail 28983 1726883068.56054: worker is 1 (out of 1 available) 28983 1726883068.56065: exiting _queue_task() for managed_node2/fail 28983 1726883068.56079: done queuing things up, now waiting for results queue to drain 28983 1726883068.56082: waiting for pending results... 28983 1726883068.56170: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883068.56359: in run() - task 0affe814-3a2d-b16d-c0a7-00000000183d 28983 1726883068.56385: variable 'ansible_search_path' from source: unknown 28983 1726883068.56395: variable 'ansible_search_path' from source: unknown 28983 1726883068.56456: calling self._execute() 28983 1726883068.56589: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.56604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.56622: variable 'omit' from source: magic vars 28983 1726883068.57121: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.57142: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883068.57330: variable 'network_state' from source: role '' defaults 28983 1726883068.57353: Evaluated conditional (network_state != {}): False 28983 1726883068.57363: when evaluation is False, skipping this task 28983 1726883068.57371: _execute() done 28983 1726883068.57399: dumping result to json 28983 1726883068.57402: done dumping result, returning 28983 1726883068.57510: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-00000000183d] 28983 1726883068.57515: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183d 28983 1726883068.57594: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183d 28983 1726883068.57597: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883068.57655: no more pending results, returning what we have 28983 1726883068.57660: results queue empty 28983 1726883068.57661: checking for any_errors_fatal 28983 1726883068.57675: done checking for any_errors_fatal 28983 1726883068.57676: checking for max_fail_percentage 28983 1726883068.57679: done checking for max_fail_percentage 28983 1726883068.57680: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.57681: done checking to see if all hosts have failed 28983 1726883068.57682: getting the remaining hosts for this loop 28983 1726883068.57684: done getting the remaining hosts for this loop 28983 1726883068.57690: getting the next task for host managed_node2 28983 1726883068.57700: done getting next task for host managed_node2 28983 1726883068.57705: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883068.57712: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883068.57742: getting variables 28983 1726883068.57744: in VariableManager get_vars() 28983 1726883068.57799: Calling all_inventory to load vars for managed_node2 28983 1726883068.57803: Calling groups_inventory to load vars for managed_node2 28983 1726883068.57806: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.57818: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.57822: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.57826: Calling groups_plugins_play to load vars for managed_node2 28983 1726883068.60477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.64196: done with get_vars() 28983 1726883068.64239: done getting variables 28983 1726883068.64322: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 21:44:28 -0400 (0:00:00.088) 0:01:38.641 ****** 28983 1726883068.64365: entering _queue_task() for managed_node2/fail 28983 1726883068.64968: worker is 1 (out of 1 available) 28983 1726883068.64979: exiting _queue_task() for managed_node2/fail 28983 1726883068.64991: done queuing things up, now waiting for results queue to drain 28983 1726883068.64993: waiting for pending results... 28983 1726883068.66397: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883068.66402: in run() - task 0affe814-3a2d-b16d-c0a7-00000000183e 28983 1726883068.66406: variable 'ansible_search_path' from source: unknown 28983 1726883068.66408: variable 'ansible_search_path' from source: unknown 28983 1726883068.66510: calling self._execute() 28983 1726883068.66930: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.66937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.66940: variable 'omit' from source: magic vars 28983 1726883068.67807: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.67878: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883068.68439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883068.71221: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883068.71307: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883068.71365: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883068.71413: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 
1726883068.71460: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883068.71561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883068.71602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883068.71640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.71700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883068.71722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883068.71838: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.71860: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883068.72011: variable 'ansible_distribution' from source: facts 28983 1726883068.72022: variable '__network_rh_distros' from source: role '' defaults 28983 1726883068.72038: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883068.72108: when evaluation is False, skipping this task 28983 1726883068.72113: _execute() done 28983 1726883068.72116: dumping result to json 28983 1726883068.72118: done dumping result, returning 28983 1726883068.72121: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-00000000183e] 28983 1726883068.72123: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183e skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883068.72395: no more pending results, returning what we have 28983 1726883068.72399: results queue empty 28983 1726883068.72400: checking for any_errors_fatal 28983 1726883068.72407: done checking for any_errors_fatal 28983 1726883068.72408: checking for max_fail_percentage 28983 1726883068.72410: done checking for max_fail_percentage 28983 1726883068.72411: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.72412: done checking to see if all hosts have failed 28983 1726883068.72413: getting the remaining hosts for this loop 28983 1726883068.72415: done getting the remaining hosts for this loop 28983 1726883068.72420: getting the next task for host managed_node2 28983 1726883068.72430: done getting next task for host managed_node2 28983 1726883068.72437: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883068.72443: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883068.72471: getting variables 28983 1726883068.72473: in VariableManager get_vars() 28983 1726883068.72530: Calling all_inventory to load vars for managed_node2 28983 1726883068.72737: Calling groups_inventory to load vars for managed_node2 28983 1726883068.72741: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.72752: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.72756: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.72760: Calling groups_plugins_play to load vars for managed_node2 28983 1726883068.73451: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183e 28983 1726883068.73457: WORKER PROCESS EXITING 28983 1726883068.75237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.78806: done with get_vars() 28983 1726883068.78849: done getting variables 28983 1726883068.78918: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:28 -0400 (0:00:00.145) 0:01:38.787 ****** 28983 1726883068.78960: entering _queue_task() for managed_node2/dnf 28983 1726883068.79307: worker is 1 (out of 1 available) 28983 1726883068.79320: exiting _queue_task() for managed_node2/dnf 28983 1726883068.79438: done queuing things up, now waiting for results queue to drain 28983 1726883068.79441: waiting for pending results... 28983 1726883068.79658: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883068.79837: in run() - task 0affe814-3a2d-b16d-c0a7-00000000183f 28983 1726883068.79860: variable 'ansible_search_path' from source: unknown 28983 1726883068.79870: variable 'ansible_search_path' from source: unknown 28983 1726883068.80001: calling self._execute() 28983 1726883068.80117: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.80283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.80301: variable 'omit' from source: magic vars 28983 1726883068.80771: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.80791: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883068.81066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883068.85203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883068.85447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883068.85578: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883068.85648: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883068.85802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883068.86020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883068.86028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883068.86083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.86342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883068.86346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883068.86739: variable 'ansible_distribution' from source: facts 28983 1726883068.86742: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.86745: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28983 1726883068.87241: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883068.87382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883068.87417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883068.87497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.87624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883068.87703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883068.87764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883068.87932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883068.87974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.88149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883068.88175: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883068.88441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883068.88444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883068.88448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.88478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883068.88565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883068.88978: variable 'network_connections' from source: include params 28983 1726883068.89057: variable 'interface' from source: play vars 28983 1726883068.89192: variable 'interface' from source: play vars 28983 1726883068.89307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883068.89568: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883068.89624: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883068.89678: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883068.89722: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883068.89793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883068.89828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883068.89889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.89929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883068.90017: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883068.90381: variable 'network_connections' from source: include params 28983 1726883068.90406: variable 'interface' from source: play vars 28983 1726883068.90515: variable 'interface' from source: play vars 28983 1726883068.90531: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883068.90543: when evaluation is False, skipping this task 28983 1726883068.90550: _execute() done 28983 1726883068.90559: dumping result to json 28983 1726883068.90567: done dumping result, returning 28983 1726883068.90610: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000183f] 28983 
1726883068.90614: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183f 28983 1726883068.90706: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000183f 28983 1726883068.90710: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883068.90785: no more pending results, returning what we have 28983 1726883068.90790: results queue empty 28983 1726883068.90791: checking for any_errors_fatal 28983 1726883068.90799: done checking for any_errors_fatal 28983 1726883068.90799: checking for max_fail_percentage 28983 1726883068.90802: done checking for max_fail_percentage 28983 1726883068.90803: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.90810: done checking to see if all hosts have failed 28983 1726883068.90811: getting the remaining hosts for this loop 28983 1726883068.90813: done getting the remaining hosts for this loop 28983 1726883068.90818: getting the next task for host managed_node2 28983 1726883068.90827: done getting next task for host managed_node2 28983 1726883068.90832: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883068.90840: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883068.90862: getting variables 28983 1726883068.90865: in VariableManager get_vars() 28983 1726883068.90941: Calling all_inventory to load vars for managed_node2 28983 1726883068.90944: Calling groups_inventory to load vars for managed_node2 28983 1726883068.90947: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.90959: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.90966: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.90993: Calling groups_plugins_play to load vars for managed_node2 28983 1726883068.92783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883068.94851: done with get_vars() 28983 1726883068.94890: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883068.94989: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:28 -0400 (0:00:00.160) 0:01:38.948 ****** 28983 1726883068.95027: entering _queue_task() for managed_node2/yum 28983 1726883068.95402: worker is 1 (out of 1 available) 28983 1726883068.95418: exiting _queue_task() for managed_node2/yum 28983 1726883068.95432: done queuing things up, now waiting for results queue to drain 28983 1726883068.95436: waiting for pending results... 28983 1726883068.95689: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883068.95795: in run() - task 0affe814-3a2d-b16d-c0a7-000000001840 28983 1726883068.95809: variable 'ansible_search_path' from source: unknown 28983 1726883068.95814: variable 'ansible_search_path' from source: unknown 28983 1726883068.95849: calling self._execute() 28983 1726883068.95937: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883068.95947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883068.95955: variable 'omit' from source: magic vars 28983 1726883068.96278: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.96290: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883068.96445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883068.98963: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883068.99010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883068.99058: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883068.99208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883068.99311: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883068.99379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883068.99426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883068.99452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883068.99485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883068.99498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883068.99583: variable 'ansible_distribution_major_version' from source: facts 28983 1726883068.99596: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28983 1726883068.99599: when evaluation is False, skipping this task 28983 1726883068.99602: _execute() done 28983 1726883068.99607: dumping result to json 28983 1726883068.99615: done dumping result, returning 28983 1726883068.99620: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001840] 28983 1726883068.99626: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001840 28983 1726883068.99727: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001840 28983 1726883068.99729: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883068.99794: no more pending results, returning what we have 28983 1726883068.99798: results queue empty 28983 1726883068.99799: checking for any_errors_fatal 28983 1726883068.99805: done checking for any_errors_fatal 28983 1726883068.99806: checking for max_fail_percentage 28983 1726883068.99808: done checking for max_fail_percentage 28983 1726883068.99809: checking to see if all hosts have failed and the running result is not ok 28983 1726883068.99810: done checking to see if all hosts have failed 28983 1726883068.99811: getting the remaining hosts for this loop 28983 1726883068.99813: done getting the remaining hosts for this loop 28983 1726883068.99817: getting the next task for host managed_node2 28983 1726883068.99826: done getting next task for host managed_node2 28983 1726883068.99830: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883068.99839: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883068.99867: getting variables 28983 1726883068.99868: in VariableManager get_vars() 28983 1726883068.99908: Calling all_inventory to load vars for managed_node2 28983 1726883068.99911: Calling groups_inventory to load vars for managed_node2 28983 1726883068.99914: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883068.99923: Calling all_plugins_play to load vars for managed_node2 28983 1726883068.99926: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883068.99929: Calling groups_plugins_play to load vars for managed_node2 28983 1726883069.01160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883069.03253: done with get_vars() 28983 1726883069.03284: done getting variables 28983 1726883069.03328: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:29 -0400 (0:00:00.083) 0:01:39.031 ****** 28983 1726883069.03358: entering _queue_task() for managed_node2/fail 28983 1726883069.03576: worker is 1 (out of 1 available) 28983 1726883069.03590: exiting _queue_task() for managed_node2/fail 28983 1726883069.03603: done queuing things up, now waiting for results queue to drain 28983 1726883069.03605: waiting for pending results... 28983 1726883069.03793: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883069.03898: in run() - task 0affe814-3a2d-b16d-c0a7-000000001841 28983 1726883069.03911: variable 'ansible_search_path' from source: unknown 28983 1726883069.03914: variable 'ansible_search_path' from source: unknown 28983 1726883069.03953: calling self._execute() 28983 1726883069.04031: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.04038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.04053: variable 'omit' from source: magic vars 28983 1726883069.04360: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.04369: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883069.04470: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883069.04639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883069.06330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883069.06385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883069.06415: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883069.06451: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883069.06478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883069.06543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.06569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.06594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.06625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.06639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.06683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.06705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.06725: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.06758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.06774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.06810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.06829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.06853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.06887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.06899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.07042: variable 'network_connections' from source: include params 28983 1726883069.07053: variable 'interface' from source: play vars 28983 1726883069.07108: variable 'interface' from source: play vars 28983 1726883069.07168: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883069.07593: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883069.07624: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883069.07651: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883069.07681: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883069.07716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883069.07736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883069.07757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.07783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883069.07833: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883069.08038: variable 'network_connections' from source: include params 28983 1726883069.08042: variable 'interface' from source: play vars 28983 1726883069.08096: variable 'interface' from source: play vars 28983 1726883069.08123: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883069.08127: when evaluation is False, skipping this task 28983 
1726883069.08130: _execute() done 28983 1726883069.08133: dumping result to json 28983 1726883069.08139: done dumping result, returning 28983 1726883069.08146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001841] 28983 1726883069.08152: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001841 28983 1726883069.08250: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001841 28983 1726883069.08253: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883069.08312: no more pending results, returning what we have 28983 1726883069.08315: results queue empty 28983 1726883069.08316: checking for any_errors_fatal 28983 1726883069.08323: done checking for any_errors_fatal 28983 1726883069.08323: checking for max_fail_percentage 28983 1726883069.08325: done checking for max_fail_percentage 28983 1726883069.08326: checking to see if all hosts have failed and the running result is not ok 28983 1726883069.08327: done checking to see if all hosts have failed 28983 1726883069.08328: getting the remaining hosts for this loop 28983 1726883069.08330: done getting the remaining hosts for this loop 28983 1726883069.08337: getting the next task for host managed_node2 28983 1726883069.08345: done getting next task for host managed_node2 28983 1726883069.08350: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883069.08355: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883069.08378: getting variables 28983 1726883069.08380: in VariableManager get_vars() 28983 1726883069.08416: Calling all_inventory to load vars for managed_node2 28983 1726883069.08419: Calling groups_inventory to load vars for managed_node2 28983 1726883069.08422: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883069.08430: Calling all_plugins_play to load vars for managed_node2 28983 1726883069.08435: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883069.08446: Calling groups_plugins_play to load vars for managed_node2 28983 1726883069.13862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883069.15449: done with get_vars() 28983 1726883069.15475: done getting variables 28983 1726883069.15515: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:29 -0400 (0:00:00.121) 0:01:39.153 ****** 28983 1726883069.15542: entering _queue_task() for managed_node2/package 28983 1726883069.15820: worker is 1 (out of 1 available) 28983 1726883069.15838: exiting _queue_task() for managed_node2/package 28983 1726883069.15851: done queuing things up, now waiting for results queue to drain 28983 1726883069.15854: waiting for pending results... 28983 1726883069.16051: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883069.16200: in run() - task 0affe814-3a2d-b16d-c0a7-000000001842 28983 1726883069.16212: variable 'ansible_search_path' from source: unknown 28983 1726883069.16218: variable 'ansible_search_path' from source: unknown 28983 1726883069.16250: calling self._execute() 28983 1726883069.16337: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.16345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.16355: variable 'omit' from source: magic vars 28983 1726883069.16680: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.16692: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883069.16865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883069.17096: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883069.17135: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883069.17193: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883069.17223: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883069.17323: variable 'network_packages' from source: role '' defaults 28983 1726883069.17416: variable '__network_provider_setup' from source: role '' defaults 28983 1726883069.17426: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883069.17482: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883069.17489: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883069.17546: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883069.17707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883069.19269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883069.19323: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883069.19357: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883069.19388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883069.19419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883069.19488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.19512: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.19533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.19568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.19581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.19621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.19643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.19663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.19701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.19713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883069.19901: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883069.19992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.20015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.20040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.20071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.20085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.20163: variable 'ansible_python' from source: facts 28983 1726883069.20178: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883069.20246: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883069.20311: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883069.20418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.20439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.20464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.20497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.20509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.20552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.20578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.20597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.20627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.20641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.20761: variable 'network_connections' from source: include params 
28983 1726883069.20766: variable 'interface' from source: play vars 28983 1726883069.20852: variable 'interface' from source: play vars 28983 1726883069.20917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883069.20940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883069.20965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.20992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883069.21037: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883069.21270: variable 'network_connections' from source: include params 28983 1726883069.21297: variable 'interface' from source: play vars 28983 1726883069.21391: variable 'interface' from source: play vars 28983 1726883069.21431: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883069.21502: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883069.21755: variable 'network_connections' from source: include params 28983 1726883069.21758: variable 'interface' from source: play vars 28983 1726883069.21816: variable 'interface' from source: play vars 28983 1726883069.21838: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883069.21905: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883069.22157: variable 'network_connections' 
from source: include params 28983 1726883069.22160: variable 'interface' from source: play vars 28983 1726883069.22219: variable 'interface' from source: play vars 28983 1726883069.22268: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883069.22319: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883069.22325: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883069.22379: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883069.22563: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883069.23039: variable 'network_connections' from source: include params 28983 1726883069.23043: variable 'interface' from source: play vars 28983 1726883069.23145: variable 'interface' from source: play vars 28983 1726883069.23149: variable 'ansible_distribution' from source: facts 28983 1726883069.23240: variable '__network_rh_distros' from source: role '' defaults 28983 1726883069.23243: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.23248: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883069.23413: variable 'ansible_distribution' from source: facts 28983 1726883069.23417: variable '__network_rh_distros' from source: role '' defaults 28983 1726883069.23441: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.23445: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883069.23643: variable 'ansible_distribution' from source: facts 28983 1726883069.23647: variable '__network_rh_distros' from source: role '' defaults 28983 1726883069.23656: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.23697: variable 'network_provider' from source: set_fact 28983 
1726883069.23715: variable 'ansible_facts' from source: unknown 28983 1726883069.24667: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883069.24675: when evaluation is False, skipping this task 28983 1726883069.24678: _execute() done 28983 1726883069.24681: dumping result to json 28983 1726883069.24683: done dumping result, returning 28983 1726883069.24691: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000001842] 28983 1726883069.24697: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001842 28983 1726883069.24801: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001842 28983 1726883069.24805: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883069.24865: no more pending results, returning what we have 28983 1726883069.24868: results queue empty 28983 1726883069.24869: checking for any_errors_fatal 28983 1726883069.24880: done checking for any_errors_fatal 28983 1726883069.24882: checking for max_fail_percentage 28983 1726883069.24884: done checking for max_fail_percentage 28983 1726883069.24885: checking to see if all hosts have failed and the running result is not ok 28983 1726883069.24886: done checking to see if all hosts have failed 28983 1726883069.24886: getting the remaining hosts for this loop 28983 1726883069.24889: done getting the remaining hosts for this loop 28983 1726883069.24894: getting the next task for host managed_node2 28983 1726883069.24902: done getting next task for host managed_node2 28983 1726883069.24906: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883069.24912: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883069.24938: getting variables 28983 1726883069.24940: in VariableManager get_vars() 28983 1726883069.24991: Calling all_inventory to load vars for managed_node2 28983 1726883069.24995: Calling groups_inventory to load vars for managed_node2 28983 1726883069.24997: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883069.25007: Calling all_plugins_play to load vars for managed_node2 28983 1726883069.25011: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883069.25014: Calling groups_plugins_play to load vars for managed_node2 28983 1726883069.26618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883069.29630: done with get_vars() 28983 1726883069.29668: done getting variables 28983 1726883069.29739: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:29 -0400 (0:00:00.142) 0:01:39.295 ****** 28983 1726883069.29782: entering _queue_task() for managed_node2/package 28983 1726883069.30100: worker is 1 (out of 1 available) 28983 1726883069.30114: exiting _queue_task() for managed_node2/package 28983 1726883069.30129: done queuing things up, now waiting for results queue to drain 28983 1726883069.30130: waiting for pending results... 
28983 1726883069.30556: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883069.30740: in run() - task 0affe814-3a2d-b16d-c0a7-000000001843 28983 1726883069.30745: variable 'ansible_search_path' from source: unknown 28983 1726883069.30747: variable 'ansible_search_path' from source: unknown 28983 1726883069.30750: calling self._execute() 28983 1726883069.30860: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.30883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.30901: variable 'omit' from source: magic vars 28983 1726883069.31362: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.31385: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883069.31554: variable 'network_state' from source: role '' defaults 28983 1726883069.31575: Evaluated conditional (network_state != {}): False 28983 1726883069.31585: when evaluation is False, skipping this task 28983 1726883069.31592: _execute() done 28983 1726883069.31600: dumping result to json 28983 1726883069.31609: done dumping result, returning 28983 1726883069.31621: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001843] 28983 1726883069.31639: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001843 28983 1726883069.31819: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001843 28983 1726883069.31823: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883069.31903: no more pending results, returning what we have 28983 1726883069.31908: results queue empty 28983 1726883069.31909: checking 
for any_errors_fatal 28983 1726883069.31917: done checking for any_errors_fatal 28983 1726883069.31918: checking for max_fail_percentage 28983 1726883069.31920: done checking for max_fail_percentage 28983 1726883069.31921: checking to see if all hosts have failed and the running result is not ok 28983 1726883069.31922: done checking to see if all hosts have failed 28983 1726883069.31923: getting the remaining hosts for this loop 28983 1726883069.31925: done getting the remaining hosts for this loop 28983 1726883069.31931: getting the next task for host managed_node2 28983 1726883069.31943: done getting next task for host managed_node2 28983 1726883069.31947: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883069.31955: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883069.31985: getting variables 28983 1726883069.31987: in VariableManager get_vars() 28983 1726883069.32238: Calling all_inventory to load vars for managed_node2 28983 1726883069.32242: Calling groups_inventory to load vars for managed_node2 28983 1726883069.32245: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883069.32254: Calling all_plugins_play to load vars for managed_node2 28983 1726883069.32257: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883069.32261: Calling groups_plugins_play to load vars for managed_node2 28983 1726883069.34787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883069.37963: done with get_vars() 28983 1726883069.38003: done getting variables 28983 1726883069.38084: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:29 -0400 (0:00:00.083) 0:01:39.379 ****** 28983 1726883069.38126: entering _queue_task() for managed_node2/package 28983 1726883069.38474: worker is 1 (out of 1 available) 28983 1726883069.38489: exiting _queue_task() for managed_node2/package 28983 1726883069.38502: done queuing things up, now waiting for results queue to drain 28983 1726883069.38504: waiting for pending results... 
28983 1726883069.38956: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883069.39040: in run() - task 0affe814-3a2d-b16d-c0a7-000000001844 28983 1726883069.39061: variable 'ansible_search_path' from source: unknown 28983 1726883069.39077: variable 'ansible_search_path' from source: unknown 28983 1726883069.39120: calling self._execute() 28983 1726883069.39241: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.39256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.39278: variable 'omit' from source: magic vars 28983 1726883069.39742: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.39761: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883069.39929: variable 'network_state' from source: role '' defaults 28983 1726883069.40039: Evaluated conditional (network_state != {}): False 28983 1726883069.40042: when evaluation is False, skipping this task 28983 1726883069.40047: _execute() done 28983 1726883069.40049: dumping result to json 28983 1726883069.40052: done dumping result, returning 28983 1726883069.40054: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001844] 28983 1726883069.40057: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001844 28983 1726883069.40142: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001844 28983 1726883069.40145: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883069.40201: no more pending results, returning what we have 28983 1726883069.40205: results queue empty 28983 1726883069.40206: checking for 
any_errors_fatal 28983 1726883069.40216: done checking for any_errors_fatal 28983 1726883069.40217: checking for max_fail_percentage 28983 1726883069.40219: done checking for max_fail_percentage 28983 1726883069.40221: checking to see if all hosts have failed and the running result is not ok 28983 1726883069.40222: done checking to see if all hosts have failed 28983 1726883069.40223: getting the remaining hosts for this loop 28983 1726883069.40225: done getting the remaining hosts for this loop 28983 1726883069.40230: getting the next task for host managed_node2 28983 1726883069.40243: done getting next task for host managed_node2 28983 1726883069.40247: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883069.40255: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883069.40287: getting variables 28983 1726883069.40289: in VariableManager get_vars() 28983 1726883069.40541: Calling all_inventory to load vars for managed_node2 28983 1726883069.40545: Calling groups_inventory to load vars for managed_node2 28983 1726883069.40548: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883069.40558: Calling all_plugins_play to load vars for managed_node2 28983 1726883069.40562: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883069.40565: Calling groups_plugins_play to load vars for managed_node2 28983 1726883069.43127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883069.46431: done with get_vars() 28983 1726883069.46475: done getting variables 28983 1726883069.46551: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:29 -0400 (0:00:00.084) 0:01:39.463 ****** 28983 1726883069.46603: entering _queue_task() for managed_node2/service 28983 1726883069.46993: worker is 1 (out of 1 available) 28983 1726883069.47007: exiting _queue_task() for managed_node2/service 28983 1726883069.47022: done queuing things up, now waiting for results queue to drain 28983 1726883069.47024: waiting for pending results... 
28983 1726883069.47367: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883069.47463: in run() - task 0affe814-3a2d-b16d-c0a7-000000001845 28983 1726883069.47467: variable 'ansible_search_path' from source: unknown 28983 1726883069.47471: variable 'ansible_search_path' from source: unknown 28983 1726883069.47574: calling self._execute() 28983 1726883069.47625: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.47632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.47648: variable 'omit' from source: magic vars 28983 1726883069.48253: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.48257: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883069.48290: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883069.48743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883069.51199: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883069.51297: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883069.51344: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883069.51388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883069.51420: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883069.51522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883069.51558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.51596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.51649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.51667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.51736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.51764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.51798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.51853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.51869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.51925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.52014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.52018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.52232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.52237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.52290: variable 'network_connections' from source: include params 28983 1726883069.52305: variable 'interface' from source: play vars 28983 1726883069.52390: variable 'interface' from source: play vars 28983 1726883069.52485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883069.52697: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883069.52754: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883069.52795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883069.52832: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883069.52885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883069.52912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883069.52946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.52979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883069.53047: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883069.53393: variable 'network_connections' from source: include params 28983 1726883069.53397: variable 'interface' from source: play vars 28983 1726883069.53639: variable 'interface' from source: play vars 28983 1726883069.53642: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883069.53646: when evaluation is False, skipping this task 28983 1726883069.53648: _execute() done 28983 1726883069.53650: dumping result to json 28983 1726883069.53651: done dumping result, returning 28983 1726883069.53653: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001845] 28983 1726883069.53655: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001845 28983 1726883069.53729: done sending task result for task 
0affe814-3a2d-b16d-c0a7-000000001845 28983 1726883069.53740: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883069.53793: no more pending results, returning what we have 28983 1726883069.53797: results queue empty 28983 1726883069.53798: checking for any_errors_fatal 28983 1726883069.53805: done checking for any_errors_fatal 28983 1726883069.53806: checking for max_fail_percentage 28983 1726883069.53807: done checking for max_fail_percentage 28983 1726883069.53808: checking to see if all hosts have failed and the running result is not ok 28983 1726883069.53809: done checking to see if all hosts have failed 28983 1726883069.53810: getting the remaining hosts for this loop 28983 1726883069.53812: done getting the remaining hosts for this loop 28983 1726883069.53817: getting the next task for host managed_node2 28983 1726883069.53824: done getting next task for host managed_node2 28983 1726883069.53829: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883069.53836: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883069.53856: getting variables 28983 1726883069.53858: in VariableManager get_vars() 28983 1726883069.53901: Calling all_inventory to load vars for managed_node2 28983 1726883069.53904: Calling groups_inventory to load vars for managed_node2 28983 1726883069.53907: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883069.53916: Calling all_plugins_play to load vars for managed_node2 28983 1726883069.53919: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883069.53923: Calling groups_plugins_play to load vars for managed_node2 28983 1726883069.58297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883069.62257: done with get_vars() 28983 1726883069.62309: done getting variables 28983 1726883069.62588: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:29 -0400 (0:00:00.160) 0:01:39.624 ****** 28983 1726883069.62632: entering _queue_task() for managed_node2/service 28983 1726883069.63428: worker is 1 (out of 1 available) 28983 1726883069.63444: exiting _queue_task() for managed_node2/service 28983 1726883069.63460: done 
queuing things up, now waiting for results queue to drain 28983 1726883069.63462: waiting for pending results... 28983 1726883069.64077: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883069.64389: in run() - task 0affe814-3a2d-b16d-c0a7-000000001846 28983 1726883069.64393: variable 'ansible_search_path' from source: unknown 28983 1726883069.64497: variable 'ansible_search_path' from source: unknown 28983 1726883069.64501: calling self._execute() 28983 1726883069.64705: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.64714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.64841: variable 'omit' from source: magic vars 28983 1726883069.65810: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.65825: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883069.66275: variable 'network_provider' from source: set_fact 28983 1726883069.66280: variable 'network_state' from source: role '' defaults 28983 1726883069.66352: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883069.66356: variable 'omit' from source: magic vars 28983 1726883069.66471: variable 'omit' from source: magic vars 28983 1726883069.66506: variable 'network_service_name' from source: role '' defaults 28983 1726883069.66733: variable 'network_service_name' from source: role '' defaults 28983 1726883069.67031: variable '__network_provider_setup' from source: role '' defaults 28983 1726883069.67039: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883069.67226: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883069.67331: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883069.67336: variable '__network_packages_default_nm' from source: role '' 
defaults 28983 1726883069.67941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883069.71732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883069.71831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883069.71990: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883069.71994: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883069.71997: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883069.72136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.72140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.72142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.72382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.72398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.72459: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.72484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.72512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.72752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.72756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.73168: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883069.73523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.73655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.73690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.73736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.73754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.74032: variable 'ansible_python' from source: facts 28983 1726883069.74055: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883069.74263: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883069.74477: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883069.74860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.74895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.74936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.75152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.75156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.75191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883069.75275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883069.75481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.75485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883069.75488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883069.75891: variable 'network_connections' from source: include params 28983 1726883069.75899: variable 'interface' from source: play vars 28983 1726883069.76010: variable 'interface' from source: play vars 28983 1726883069.76343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883069.76867: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883069.76927: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883069.77090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883069.77223: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883069.77314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883069.77487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883069.77526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883069.77570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883069.77687: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883069.78446: variable 'network_connections' from source: include params 28983 1726883069.78548: variable 'interface' from source: play vars 28983 1726883069.78759: variable 'interface' from source: play vars 28983 1726883069.78819: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883069.79121: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883069.79834: variable 'network_connections' from source: include params 28983 1726883069.79945: variable 'interface' from source: play vars 28983 1726883069.80052: variable 'interface' from source: play vars 28983 1726883069.80063: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883069.80344: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883069.81239: variable 'network_connections' from source: include params 28983 1726883069.81243: variable 'interface' from source: play vars 28983 1726883069.81268: variable 'interface' from source: play vars 28983 1726883069.81521: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883069.81661: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883069.81665: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883069.81689: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883069.82358: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883069.84027: variable 'network_connections' from source: include params 28983 1726883069.84050: variable 'interface' from source: play vars 28983 1726883069.84109: variable 'interface' from source: play vars 28983 1726883069.84121: variable 'ansible_distribution' from source: facts 28983 1726883069.84124: variable '__network_rh_distros' from source: role '' defaults 28983 1726883069.84133: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.84365: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883069.84788: variable 'ansible_distribution' from source: facts 28983 1726883069.84792: variable '__network_rh_distros' from source: role '' defaults 28983 1726883069.84875: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.84878: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883069.85229: variable 'ansible_distribution' from source: facts 28983 1726883069.85233: variable '__network_rh_distros' from source: role '' defaults 28983 1726883069.85243: variable 'ansible_distribution_major_version' from source: facts 28983 1726883069.85286: variable 'network_provider' from source: set_fact 28983 1726883069.85317: variable 'omit' from source: magic vars 28983 1726883069.85636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883069.85642: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883069.85645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883069.85647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883069.85649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883069.85679: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883069.85683: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.85688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883069.86141: Set connection var ansible_connection to ssh 28983 1726883069.86144: Set connection var ansible_shell_executable to /bin/sh 28983 1726883069.86147: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883069.86149: Set connection var ansible_timeout to 10 28983 1726883069.86151: Set connection var ansible_pipelining to False 28983 1726883069.86153: Set connection var ansible_shell_type to sh 28983 1726883069.86156: variable 'ansible_shell_executable' from source: unknown 28983 1726883069.86158: variable 'ansible_connection' from source: unknown 28983 1726883069.86160: variable 'ansible_module_compression' from source: unknown 28983 1726883069.86162: variable 'ansible_shell_type' from source: unknown 28983 1726883069.86164: variable 'ansible_shell_executable' from source: unknown 28983 1726883069.86166: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883069.86168: variable 'ansible_pipelining' from source: unknown 28983 1726883069.86170: variable 'ansible_timeout' from source: unknown 28983 1726883069.86175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883069.86423: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883069.86437: variable 'omit' from source: magic vars 28983 1726883069.86445: starting attempt loop 28983 1726883069.86448: running the handler 28983 1726883069.86648: variable 'ansible_facts' from source: unknown 28983 1726883069.87781: _low_level_execute_command(): starting 28983 1726883069.87789: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883069.88643: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883069.88648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883069.88656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883069.88659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883069.88865: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883069.90632: stdout chunk (state=3): >>>/root <<< 28983 1726883069.90806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883069.90810: stdout chunk (state=3): >>><<< 28983 1726883069.90819: stderr chunk (state=3): >>><<< 28983 1726883069.90842: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883069.90856: _low_level_execute_command(): starting 28983 1726883069.90862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474 `" && echo ansible-tmp-1726883069.908422-32678-230101481225474="` echo /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474 `" 
) && sleep 0' 28983 1726883069.92628: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883069.92631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883069.92637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883069.92640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883069.92919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883069.92930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883069.92939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883069.93091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883069.93253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883069.95749: stdout chunk (state=3): >>>ansible-tmp-1726883069.908422-32678-230101481225474=/root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474 <<< 28983 1726883069.95753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883069.95763: stdout chunk (state=3): >>><<< 28983 1726883069.95766: stderr chunk (state=3): >>><<< 28983 
1726883069.95790: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883069.908422-32678-230101481225474=/root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883069.95827: variable 'ansible_module_compression' from source: unknown 28983 1726883069.95887: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883069.96422: variable 'ansible_facts' from source: unknown 28983 1726883069.96617: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py 28983 1726883069.97560: Sending initial data 28983 1726883069.97564: Sent initial data (155 bytes) 28983 1726883069.98780: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883069.98790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883069.98926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883069.98933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883069.98997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883069.99051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883070.00797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883070.00868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883070.00970: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpnrctgh4x /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py <<< 28983 1726883070.00981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py" <<< 28983 1726883070.01094: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpnrctgh4x" to remote "/root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py" <<< 28983 1726883070.05855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883070.06276: stderr chunk (state=3): >>><<< 28983 1726883070.06279: stdout chunk (state=3): >>><<< 28983 1726883070.06282: done transferring module to remote 28983 1726883070.06284: _low_level_execute_command(): starting 28983 1726883070.06287: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/ /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py && sleep 0' 28983 1726883070.07399: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883070.07404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883070.07407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883070.07409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883070.07412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883070.07651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883070.07677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883070.09630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883070.09710: stderr chunk (state=3): >>><<< 28983 1726883070.09721: stdout chunk (state=3): >>><<< 28983 1726883070.09747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883070.09909: _low_level_execute_command(): starting 28983 1726883070.09912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/AnsiballZ_systemd.py && sleep 0' 28983 1726883070.11451: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883070.11454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883070.11457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883070.11459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883070.11565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883070.11568: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883070.11670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883070.11749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883070.11850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883070.44821: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] 
; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4456448", "MemoryAvailable": "infinity", "CPUUsageNSec": "1625477000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", 
"DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": 
"dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", 
"ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883070.46978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883070.47342: stderr chunk (state=3): >>><<< 28983 1726883070.47345: stdout chunk (state=3): >>><<< 28983 1726883070.47350: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4456448", "MemoryAvailable": "infinity", "CPUUsageNSec": "1625477000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", 
"NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883070.47740: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883070.47800: _low_level_execute_command(): starting 28983 1726883070.47941: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883069.908422-32678-230101481225474/ > /dev/null 2>&1 && sleep 0' 28983 1726883070.48617: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883070.48631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883070.48652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883070.48715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883070.48759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883070.48863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883070.50895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883070.50899: stdout chunk (state=3): >>><<< 28983 1726883070.50911: stderr chunk (state=3): >>><<< 28983 1726883070.50937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883070.50945: handler run complete 28983 1726883070.51050: attempt loop complete, returning result 28983 1726883070.51054: _execute() done 28983 1726883070.51057: dumping result to json 28983 1726883070.51085: done dumping result, returning 28983 1726883070.51096: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000001846] 28983 1726883070.51103: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001846 28983 1726883070.51999: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001846 28983 1726883070.52002: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883070.52093: no more pending results, returning what we have 28983 1726883070.52097: results queue empty 28983 1726883070.52098: checking for any_errors_fatal 28983 1726883070.52103: done checking for any_errors_fatal 28983 1726883070.52104: checking for max_fail_percentage 28983 1726883070.52106: done checking for max_fail_percentage 28983 1726883070.52107: checking to see if all hosts have failed and the running result is not ok 28983 1726883070.52108: done checking to see if all hosts have failed 28983 1726883070.52109: getting the remaining hosts for this loop 28983 1726883070.52113: done getting the remaining hosts for this loop 28983 1726883070.52118: 
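Throughout this run, `_low_level_execute_command()` frames each remote stdout/stderr fragment between `>>>` and `<<<` markers (the "chunk (state=N)" entries above). When post-processing a captured `-vvv` log, those payloads can be pulled out with a small helper; the marker format here is taken from the log lines themselves:

```python
import re

# Sketch: extract the payload between ">>>" and "<<<" chunk markers as they
# appear in ansible -vvv worker logs (marker format observed in this log).
def extract_chunks(log_text: str) -> list:
    return [m.strip() for m in re.findall(r">>>(.*?)<<<", log_text, re.DOTALL)]

line = "stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<"
chunks = extract_chunks(line)
# chunks == ["debug1: auto-mux: Trying existing master"]
```

`re.DOTALL` matters because a single chunk can span several wrapped lines, as the long JSON result above does.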
getting the next task for host managed_node2 28983 1726883070.52128: done getting next task for host managed_node2 28983 1726883070.52133: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883070.52144: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883070.52159: getting variables 28983 1726883070.52161: in VariableManager get_vars() 28983 1726883070.52203: Calling all_inventory to load vars for managed_node2 28983 1726883070.52207: Calling groups_inventory to load vars for managed_node2 28983 1726883070.52210: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883070.52220: Calling all_plugins_play to load vars for managed_node2 28983 1726883070.52224: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883070.52228: Calling groups_plugins_play to load vars for managed_node2 28983 1726883070.53760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883070.56750: done with get_vars() 28983 1726883070.56777: done getting variables 28983 1726883070.56829: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:30 -0400 (0:00:00.942) 0:01:40.566 ****** 28983 1726883070.56865: entering _queue_task() for managed_node2/service 28983 1726883070.57133: worker is 1 (out of 1 available) 28983 1726883070.57150: exiting _queue_task() for managed_node2/service 28983 1726883070.57165: done queuing things up, now waiting for results queue to drain 28983 1726883070.57167: waiting for pending results... 
28983 1726883070.57367: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
28983 1726883070.57503: in run() - task 0affe814-3a2d-b16d-c0a7-000000001847
28983 1726883070.57517: variable 'ansible_search_path' from source: unknown
28983 1726883070.57521: variable 'ansible_search_path' from source: unknown
28983 1726883070.57555: calling self._execute()
28983 1726883070.57668: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883070.57677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883070.57710: variable 'omit' from source: magic vars
28983 1726883070.58297: variable 'ansible_distribution_major_version' from source: facts
28983 1726883070.58301: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883070.58501: variable 'network_provider' from source: set_fact
28983 1726883070.58524: Evaluated conditional (network_provider == "nm"): True
28983 1726883070.58727: variable '__network_wpa_supplicant_required' from source: role '' defaults
28983 1726883070.59038: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
28983 1726883070.59233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726883070.62339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726883070.62425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726883070.62473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28983 1726883070.62518: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28983 1726883070.62556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28983 1726883070.62649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726883070.62757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726883070.62761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726883070.62800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726883070.62804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726883070.62960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726883070.62965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726883070.62968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726883070.63139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726883070.63144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726883070.63146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726883070.63149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726883070.63152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726883070.63180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726883070.63198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726883070.63380: variable 'network_connections' from source: include params
28983 1726883070.63394: variable 'interface' from source: play vars
28983 1726883070.63476: variable 'interface' from source: play vars
28983 1726883070.63566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
28983 1726883070.63787: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
28983 1726883070.63844: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
28983 1726883070.63884: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
28983 1726883070.63980: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
28983 1726883070.64013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
28983 1726883070.64056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
28983 1726883070.64107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726883070.64142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
28983 1726883070.64256: variable '__network_wireless_connections_defined' from source: role '' defaults
28983 1726883070.64545: variable 'network_connections' from source: include params
28983 1726883070.64550: variable 'interface' from source: play vars
28983 1726883070.64623: variable 'interface' from source: play vars
28983 1726883070.64681: Evaluated conditional (__network_wpa_supplicant_required): False
28983 1726883070.64685: when evaluation is False, skipping this task
28983 1726883070.64687: _execute() done
28983 1726883070.64690: dumping result to json
28983 1726883070.64693: done dumping result, returning
28983 1726883070.64704: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000001847]
28983 1726883070.64714: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001847
28983 1726883070.64970: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001847
28983 1726883070.64976: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
28983 1726883070.65024: no more pending results, returning what we have
28983 1726883070.65028: results queue empty
28983 1726883070.65029: checking for any_errors_fatal
28983 1726883070.65057: done checking for any_errors_fatal
28983 1726883070.65058: checking for max_fail_percentage
28983 1726883070.65060: done checking for max_fail_percentage
28983 1726883070.65061: checking to see if all hosts have failed and the running result is not ok
28983 1726883070.65062: done checking to see if all hosts have failed
28983 1726883070.65063: getting the remaining hosts for this loop
28983 1726883070.65065: done getting the remaining hosts for this loop
28983 1726883070.65069: getting the next task for host managed_node2
28983 1726883070.65079: done getting next task for host managed_node2
28983 1726883070.65083: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
28983 1726883070.65108: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883070.65139: getting variables
28983 1726883070.65141: in VariableManager get_vars()
28983 1726883070.65185: Calling all_inventory to load vars for managed_node2
28983 1726883070.65189: Calling groups_inventory to load vars for managed_node2
28983 1726883070.65191: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883070.65201: Calling all_plugins_play to load vars for managed_node2
28983 1726883070.65203: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883070.65206: Calling groups_plugins_play to load vars for managed_node2
28983 1726883070.67063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883070.68680: done with get_vars()
28983 1726883070.68703: done getting variables
28983 1726883070.68754: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:44:30 -0400 (0:00:00.119) 0:01:40.685 ******
28983 1726883070.68785: entering _queue_task() for managed_node2/service
28983 1726883070.69026: worker is 1 (out of 1 available)
28983 1726883070.69042: exiting _queue_task() for managed_node2/service
28983 1726883070.69055: done queuing things up, now waiting for results queue to drain
28983 1726883070.69057: waiting for pending results...
28983 1726883070.69263: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service
28983 1726883070.69384: in run() - task 0affe814-3a2d-b16d-c0a7-000000001848
28983 1726883070.69401: variable 'ansible_search_path' from source: unknown
28983 1726883070.69405: variable 'ansible_search_path' from source: unknown
28983 1726883070.69438: calling self._execute()
28983 1726883070.69529: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883070.69540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883070.69550: variable 'omit' from source: magic vars
28983 1726883070.69873: variable 'ansible_distribution_major_version' from source: facts
28983 1726883070.69886: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883070.69991: variable 'network_provider' from source: set_fact
28983 1726883070.69998: Evaluated conditional (network_provider == "initscripts"): False
28983 1726883070.70001: when evaluation is False, skipping this task
28983 1726883070.70004: _execute() done
28983 1726883070.70009: dumping result to json
28983 1726883070.70013: done dumping result, returning
28983 1726883070.70021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000001848]
28983 1726883070.70027: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001848
28983 1726883070.70123: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001848
28983 1726883070.70126: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
28983 1726883070.70204: no more pending results, returning what we have
28983 1726883070.70208: results queue empty
28983 1726883070.70209: checking for any_errors_fatal
28983 1726883070.70215: done checking for any_errors_fatal
28983 1726883070.70216: checking for max_fail_percentage
28983 1726883070.70217: done checking for max_fail_percentage
28983 1726883070.70218: checking to see if all hosts have failed and the running result is not ok
28983 1726883070.70219: done checking to see if all hosts have failed
28983 1726883070.70220: getting the remaining hosts for this loop
28983 1726883070.70222: done getting the remaining hosts for this loop
28983 1726883070.70226: getting the next task for host managed_node2
28983 1726883070.70236: done getting next task for host managed_node2
28983 1726883070.70240: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
28983 1726883070.70246: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883070.70268: getting variables
28983 1726883070.70270: in VariableManager get_vars()
28983 1726883070.70307: Calling all_inventory to load vars for managed_node2
28983 1726883070.70311: Calling groups_inventory to load vars for managed_node2
28983 1726883070.70313: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883070.70326: Calling all_plugins_play to load vars for managed_node2
28983 1726883070.70329: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883070.70332: Calling groups_plugins_play to load vars for managed_node2
28983 1726883070.71548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883070.73257: done with get_vars()
28983 1726883070.73281: done getting variables
28983 1726883070.73328: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:44:30 -0400 (0:00:00.045) 0:01:40.731 ******
28983 1726883070.73361: entering _queue_task() for managed_node2/copy
28983 1726883070.73588: worker is 1 (out of 1 available)
28983 1726883070.73601: exiting _queue_task() for managed_node2/copy
28983 1726883070.73613: done queuing things up, now waiting for results queue to drain
28983 1726883070.73615: waiting for pending results...
28983 1726883070.73817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
28983 1726883070.73930: in run() - task 0affe814-3a2d-b16d-c0a7-000000001849
28983 1726883070.73943: variable 'ansible_search_path' from source: unknown
28983 1726883070.73949: variable 'ansible_search_path' from source: unknown
28983 1726883070.73983: calling self._execute()
28983 1726883070.74072: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883070.74079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883070.74092: variable 'omit' from source: magic vars
28983 1726883070.74414: variable 'ansible_distribution_major_version' from source: facts
28983 1726883070.74426: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883070.74531: variable 'network_provider' from source: set_fact
28983 1726883070.74540: Evaluated conditional (network_provider == "initscripts"): False
28983 1726883070.74543: when evaluation is False, skipping this task
28983 1726883070.74546: _execute() done
28983 1726883070.74551: dumping result to json
28983 1726883070.74556: done dumping result, returning
28983 1726883070.74564: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000001849]
28983 1726883070.74569: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001849
28983 1726883070.74675: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001849
28983 1726883070.74678: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
28983 1726883070.74728: no more pending results, returning what we have
28983 1726883070.74732: results queue empty
28983 1726883070.74735: checking for any_errors_fatal
28983 1726883070.74742: done checking for any_errors_fatal
28983 1726883070.74743: checking for max_fail_percentage
28983 1726883070.74745: done checking for max_fail_percentage
28983 1726883070.74746: checking to see if all hosts have failed and the running result is not ok
28983 1726883070.74746: done checking to see if all hosts have failed
28983 1726883070.74747: getting the remaining hosts for this loop
28983 1726883070.74749: done getting the remaining hosts for this loop
28983 1726883070.74753: getting the next task for host managed_node2
28983 1726883070.74762: done getting next task for host managed_node2
28983 1726883070.74766: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
28983 1726883070.74772: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883070.74792: getting variables
28983 1726883070.74793: in VariableManager get_vars()
28983 1726883070.74831: Calling all_inventory to load vars for managed_node2
28983 1726883070.74843: Calling groups_inventory to load vars for managed_node2
28983 1726883070.74846: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883070.74854: Calling all_plugins_play to load vars for managed_node2
28983 1726883070.74856: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883070.74859: Calling groups_plugins_play to load vars for managed_node2
28983 1726883070.76061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883070.77663: done with get_vars()
28983 1726883070.77687: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:44:30 -0400 (0:00:00.043) 0:01:40.775 ******
28983 1726883070.77755: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
28983 1726883070.77969: worker is 1 (out of 1 available)
28983 1726883070.77982: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
28983 1726883070.77995: done queuing things up, now waiting for results queue to drain
28983 1726883070.77997: waiting for pending results...
28983 1726883070.78184: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
28983 1726883070.78298: in run() - task 0affe814-3a2d-b16d-c0a7-00000000184a
28983 1726883070.78311: variable 'ansible_search_path' from source: unknown
28983 1726883070.78314: variable 'ansible_search_path' from source: unknown
28983 1726883070.78350: calling self._execute()
28983 1726883070.78430: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883070.78442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883070.78454: variable 'omit' from source: magic vars
28983 1726883070.78760: variable 'ansible_distribution_major_version' from source: facts
28983 1726883070.78773: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883070.78784: variable 'omit' from source: magic vars
28983 1726883070.78837: variable 'omit' from source: magic vars
28983 1726883070.78973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726883070.80997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726883070.81049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726883070.81084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
28983 1726883070.81115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
28983 1726883070.81138: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
28983 1726883070.81204: variable 'network_provider' from source: set_fact
28983 1726883070.81314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
28983 1726883070.81338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
28983 1726883070.81359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
28983 1726883070.81399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
28983 1726883070.81411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
28983 1726883070.81471: variable 'omit' from source: magic vars
28983 1726883070.81564: variable 'omit' from source: magic vars
28983 1726883070.81655: variable 'network_connections' from source: include params
28983 1726883070.81666: variable 'interface' from source: play vars
28983 1726883070.81720: variable 'interface' from source: play vars
28983 1726883070.81855: variable 'omit' from source: magic vars
28983 1726883070.81863: variable '__lsr_ansible_managed' from source: task vars
28983 1726883070.81914: variable '__lsr_ansible_managed' from source: task vars
28983 1726883070.82064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
28983 1726883070.82248: Loaded config def from plugin (lookup/template)
28983 1726883070.82251: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
28983 1726883070.82282: File lookup term: get_ansible_managed.j2
28983 1726883070.82285: variable 'ansible_search_path' from source: unknown
28983 1726883070.82288: evaluation_path:
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
28983 1726883070.82302: search_path:
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
28983 1726883070.82316: variable 'ansible_search_path' from source: unknown
28983 1726883070.89041: variable 'ansible_managed' from source: unknown
28983 1726883070.89046: variable 'omit' from source: magic vars
28983 1726883070.89049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
28983 1726883070.89052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
28983 1726883070.89055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
28983 1726883070.89081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883070.89092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
28983 1726883070.89127: variable 'inventory_hostname' from source: host vars for 'managed_node2'
28983 1726883070.89130: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883070.89133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883070.89391: Set connection var ansible_connection to ssh
28983 1726883070.89394: Set connection var ansible_shell_executable to /bin/sh
28983 1726883070.89397: Set connection var ansible_module_compression to ZIP_DEFLATED
28983 1726883070.89399: Set connection var ansible_timeout to 10
28983 1726883070.89402: Set connection var ansible_pipelining to False
28983 1726883070.89404: Set connection var ansible_shell_type to sh
28983 1726883070.89406: variable 'ansible_shell_executable' from source: unknown
28983 1726883070.89408: variable 'ansible_connection' from source: unknown
28983 1726883070.89411: variable 'ansible_module_compression' from source: unknown
28983 1726883070.89413: variable 'ansible_shell_type' from source: unknown
28983 1726883070.89415: variable 'ansible_shell_executable' from source: unknown
28983 1726883070.89418: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883070.89420: variable 'ansible_pipelining' from source: unknown
28983 1726883070.89422: variable 'ansible_timeout' from source: unknown
28983 1726883070.89424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883070.89499: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
28983 1726883070.89510: variable 'omit' from source: magic vars
28983 1726883070.89513: starting attempt loop
28983 1726883070.89516: running the handler
28983 1726883070.89555: _low_level_execute_command(): starting
28983 1726883070.89559: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
28983 1726883070.90226: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
28983 1726883070.90245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883070.90262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883070.90273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883070.90290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<<
28983 1726883070.90298: stderr chunk (state=3): >>>debug2: match not found <<<
28983 1726883070.90308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883070.90324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
28983 1726883070.90332: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<<
28983 1726883070.90342: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
28983 1726883070.90352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
28983 1726883070.90437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
28983 1726883070.90445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
28983 1726883070.90448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<<
28983 1726883070.90472: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883070.90492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883070.90596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883070.92360: stdout chunk (state=3): >>>/root <<<
28983 1726883070.92557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883070.92560: stdout chunk (state=3): >>><<<
28983 1726883070.92563: stderr chunk (state=3): >>><<<
28983 1726883070.92677: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883070.92681: _low_level_execute_command(): starting
28983 1726883070.92684: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420 `" && echo ansible-tmp-1726883070.925873-32721-41532519038420="` echo /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420 `" ) && sleep 0'
28983 1726883070.93255: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
28983 1726883070.93328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
28983 1726883070.93371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
28983 1726883070.93389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
28983 1726883070.93491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
28983 1726883070.95562: stdout chunk (state=3): >>>ansible-tmp-1726883070.925873-32721-41532519038420=/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420 <<<
28983 1726883070.95740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
28983 1726883070.95753: stderr chunk (state=3): >>><<<
28983 1726883070.95762: stdout chunk (state=3): >>><<<
28983 1726883070.95790: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883070.925873-32721-41532519038420=/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
28983 1726883070.95852: variable 'ansible_module_compression' from source: unknown
28983 1726883070.95910: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
28983 1726883070.95960: variable 'ansible_facts' from source: unknown
28983 1726883070.96076: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py
28983 1726883070.96268: Sending initial data
28983 1726883070.96274: Sent initial data (166 bytes)
28983 
1726883070.96888: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883070.96906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883070.96921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883070.96940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883070.96956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883070.96993: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883070.97008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883070.97046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883070.97116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883070.97133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883070.97156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883070.97254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883070.98973: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883070.99058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883070.99130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpaupteimg /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py <<< 28983 1726883070.99154: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py" <<< 28983 1726883070.99225: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpaupteimg" to remote "/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py" <<< 28983 1726883071.01344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883071.01348: stdout chunk (state=3): >>><<< 28983 1726883071.01350: stderr chunk (state=3): >>><<< 28983 1726883071.01352: done transferring module to remote 28983 1726883071.01354: _low_level_execute_command(): starting 28983 1726883071.01357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/ /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py && sleep 0' 28983 1726883071.01822: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.01828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.01862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883071.01866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.01868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.01980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883071.01984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.02194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883071.04142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883071.04194: stderr chunk (state=3): >>><<< 28983 1726883071.04207: stdout chunk (state=3): >>><<< 28983 1726883071.04391: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883071.04396: _low_level_execute_command(): starting 28983 1726883071.04399: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/AnsiballZ_network_connections.py && sleep 0' 28983 1726883071.05111: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883071.05129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.05150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.05171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.05200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 <<< 28983 1726883071.05312: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883071.05343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883071.05363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.05480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883071.37578: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883071.40140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection 
to 10.31.46.139 closed. <<< 28983 1726883071.40146: stdout chunk (state=3): >>><<< 28983 1726883071.40149: stderr chunk (state=3): >>><<< 28983 1726883071.40152: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883071.40154: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883071.40175: _low_level_execute_command(): starting 28983 1726883071.40186: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883070.925873-32721-41532519038420/ > /dev/null 2>&1 && sleep 0' 28983 1726883071.40830: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883071.40846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.40861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.40880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.40951: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.41012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883071.41033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883071.41068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.41164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883071.43237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883071.43261: stdout chunk (state=3): >>><<< 28983 1726883071.43264: stderr chunk (state=3): >>><<< 28983 1726883071.43282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883071.43439: handler run complete 28983 1726883071.43442: attempt loop complete, returning result 28983 1726883071.43445: _execute() done 28983 1726883071.43447: dumping result to json 28983 1726883071.43449: done dumping result, returning 28983 1726883071.43452: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-00000000184a] 28983 1726883071.43457: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184a 28983 1726883071.43541: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184a 28983 1726883071.43545: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d 28983 1726883071.43781: no more pending results, returning what we have 28983 1726883071.43785: results queue empty 28983 1726883071.43786: checking for any_errors_fatal 28983 1726883071.43793: done checking for any_errors_fatal 28983 1726883071.43794: checking for max_fail_percentage 28983 1726883071.43796: done checking 
for max_fail_percentage 28983 1726883071.43797: checking to see if all hosts have failed and the running result is not ok 28983 1726883071.43798: done checking to see if all hosts have failed 28983 1726883071.43799: getting the remaining hosts for this loop 28983 1726883071.43801: done getting the remaining hosts for this loop 28983 1726883071.43805: getting the next task for host managed_node2 28983 1726883071.43813: done getting next task for host managed_node2 28983 1726883071.43817: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883071.43822: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883071.43941: getting variables 28983 1726883071.43943: in VariableManager get_vars() 28983 1726883071.43994: Calling all_inventory to load vars for managed_node2 28983 1726883071.43998: Calling groups_inventory to load vars for managed_node2 28983 1726883071.44001: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883071.44011: Calling all_plugins_play to load vars for managed_node2 28983 1726883071.44015: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883071.44019: Calling groups_plugins_play to load vars for managed_node2 28983 1726883071.46678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883071.49803: done with get_vars() 28983 1726883071.49849: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:31 -0400 (0:00:00.722) 0:01:41.497 ****** 28983 1726883071.49964: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883071.50458: worker is 1 (out of 1 available) 28983 1726883071.50471: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883071.50488: done queuing things up, now waiting for results queue to drain 28983 1726883071.50490: waiting for pending results... 
28983 1726883071.50732: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883071.50940: in run() - task 0affe814-3a2d-b16d-c0a7-00000000184b 28983 1726883071.50945: variable 'ansible_search_path' from source: unknown 28983 1726883071.50948: variable 'ansible_search_path' from source: unknown 28983 1726883071.50976: calling self._execute() 28983 1726883071.51104: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.51139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.51144: variable 'omit' from source: magic vars 28983 1726883071.51611: variable 'ansible_distribution_major_version' from source: facts 28983 1726883071.51630: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883071.51805: variable 'network_state' from source: role '' defaults 28983 1726883071.51939: Evaluated conditional (network_state != {}): False 28983 1726883071.51944: when evaluation is False, skipping this task 28983 1726883071.51947: _execute() done 28983 1726883071.51950: dumping result to json 28983 1726883071.51953: done dumping result, returning 28983 1726883071.51956: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-00000000184b] 28983 1726883071.51959: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184b 28983 1726883071.52028: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184b 28983 1726883071.52033: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883071.52100: no more pending results, returning what we have 28983 1726883071.52105: results queue empty 28983 1726883071.52106: checking for any_errors_fatal 28983 1726883071.52119: done checking for any_errors_fatal 
28983 1726883071.52120: checking for max_fail_percentage 28983 1726883071.52122: done checking for max_fail_percentage 28983 1726883071.52124: checking to see if all hosts have failed and the running result is not ok 28983 1726883071.52125: done checking to see if all hosts have failed 28983 1726883071.52126: getting the remaining hosts for this loop 28983 1726883071.52129: done getting the remaining hosts for this loop 28983 1726883071.52136: getting the next task for host managed_node2 28983 1726883071.52326: done getting next task for host managed_node2 28983 1726883071.52330: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883071.52340: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883071.52360: getting variables 28983 1726883071.52362: in VariableManager get_vars() 28983 1726883071.52404: Calling all_inventory to load vars for managed_node2 28983 1726883071.52407: Calling groups_inventory to load vars for managed_node2 28983 1726883071.52410: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883071.52419: Calling all_plugins_play to load vars for managed_node2 28983 1726883071.52423: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883071.52431: Calling groups_plugins_play to load vars for managed_node2 28983 1726883071.54722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883071.58035: done with get_vars() 28983 1726883071.58071: done getting variables 28983 1726883071.58152: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:31 -0400 (0:00:00.082) 0:01:41.579 ****** 28983 1726883071.58198: entering _queue_task() for managed_node2/debug 28983 1726883071.58601: worker is 1 (out of 1 available) 28983 1726883071.58614: exiting _queue_task() for managed_node2/debug 28983 1726883071.58629: done queuing things up, now waiting for results queue to drain 28983 1726883071.58631: waiting for pending results... 
28983 1726883071.58942: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883071.59241: in run() - task 0affe814-3a2d-b16d-c0a7-00000000184c 28983 1726883071.59246: variable 'ansible_search_path' from source: unknown 28983 1726883071.59248: variable 'ansible_search_path' from source: unknown 28983 1726883071.59251: calling self._execute() 28983 1726883071.59312: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.59319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.59332: variable 'omit' from source: magic vars 28983 1726883071.59817: variable 'ansible_distribution_major_version' from source: facts 28983 1726883071.59838: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883071.59845: variable 'omit' from source: magic vars 28983 1726883071.59929: variable 'omit' from source: magic vars 28983 1726883071.59980: variable 'omit' from source: magic vars 28983 1726883071.60026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883071.60076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883071.60103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883071.60123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883071.60137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883071.60184: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883071.60188: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.60239: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883071.60320: Set connection var ansible_connection to ssh 28983 1726883071.60333: Set connection var ansible_shell_executable to /bin/sh 28983 1726883071.60345: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883071.60356: Set connection var ansible_timeout to 10 28983 1726883071.60364: Set connection var ansible_pipelining to False 28983 1726883071.60378: Set connection var ansible_shell_type to sh 28983 1726883071.60404: variable 'ansible_shell_executable' from source: unknown 28983 1726883071.60408: variable 'ansible_connection' from source: unknown 28983 1726883071.60411: variable 'ansible_module_compression' from source: unknown 28983 1726883071.60414: variable 'ansible_shell_type' from source: unknown 28983 1726883071.60539: variable 'ansible_shell_executable' from source: unknown 28983 1726883071.60543: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.60546: variable 'ansible_pipelining' from source: unknown 28983 1726883071.60548: variable 'ansible_timeout' from source: unknown 28983 1726883071.60551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.60615: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883071.60628: variable 'omit' from source: magic vars 28983 1726883071.60636: starting attempt loop 28983 1726883071.60640: running the handler 28983 1726883071.60800: variable '__network_connections_result' from source: set_fact 28983 1726883071.60868: handler run complete 28983 1726883071.60897: attempt loop complete, returning result 28983 1726883071.60900: _execute() done 28983 1726883071.60903: dumping result to json 28983 1726883071.60909: 
done dumping result, returning 28983 1726883071.60927: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000184c] 28983 1726883071.60933: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184c ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d" ] } 28983 1726883071.61115: no more pending results, returning what we have 28983 1726883071.61119: results queue empty 28983 1726883071.61120: checking for any_errors_fatal 28983 1726883071.61134: done checking for any_errors_fatal 28983 1726883071.61135: checking for max_fail_percentage 28983 1726883071.61137: done checking for max_fail_percentage 28983 1726883071.61140: checking to see if all hosts have failed and the running result is not ok 28983 1726883071.61141: done checking to see if all hosts have failed 28983 1726883071.61142: getting the remaining hosts for this loop 28983 1726883071.61144: done getting the remaining hosts for this loop 28983 1726883071.61150: getting the next task for host managed_node2 28983 1726883071.61159: done getting next task for host managed_node2 28983 1726883071.61164: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883071.61174: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883071.61189: getting variables 28983 1726883071.61191: in VariableManager get_vars() 28983 1726883071.61358: Calling all_inventory to load vars for managed_node2 28983 1726883071.61362: Calling groups_inventory to load vars for managed_node2 28983 1726883071.61365: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883071.61379: Calling all_plugins_play to load vars for managed_node2 28983 1726883071.61384: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883071.61388: Calling groups_plugins_play to load vars for managed_node2 28983 1726883071.61910: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184c 28983 1726883071.61913: WORKER PROCESS EXITING 28983 1726883071.63771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883071.66808: done with get_vars() 28983 1726883071.66854: done getting variables 28983 1726883071.66922: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:31 -0400 (0:00:00.087) 0:01:41.667 ****** 28983 1726883071.66971: entering _queue_task() for managed_node2/debug 28983 1726883071.67294: worker is 1 (out of 1 available) 28983 1726883071.67309: exiting _queue_task() for managed_node2/debug 28983 1726883071.67323: done queuing things up, now waiting for results queue to drain 28983 1726883071.67325: waiting for pending results... 28983 1726883071.67753: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883071.67813: in run() - task 0affe814-3a2d-b16d-c0a7-00000000184d 28983 1726883071.67838: variable 'ansible_search_path' from source: unknown 28983 1726883071.67848: variable 'ansible_search_path' from source: unknown 28983 1726883071.67896: calling self._execute() 28983 1726883071.68018: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.68031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.68051: variable 'omit' from source: magic vars 28983 1726883071.68499: variable 'ansible_distribution_major_version' from source: facts 28983 1726883071.68522: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883071.68536: variable 'omit' from source: magic vars 28983 1726883071.68631: variable 'omit' from source: magic vars 28983 1726883071.68672: variable 'omit' from source: magic vars 28983 1726883071.68740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883071.68776: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883071.68851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883071.68855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883071.68857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883071.68889: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883071.68900: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.68912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.69043: Set connection var ansible_connection to ssh 28983 1726883071.69061: Set connection var ansible_shell_executable to /bin/sh 28983 1726883071.69083: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883071.69097: Set connection var ansible_timeout to 10 28983 1726883071.69180: Set connection var ansible_pipelining to False 28983 1726883071.69183: Set connection var ansible_shell_type to sh 28983 1726883071.69186: variable 'ansible_shell_executable' from source: unknown 28983 1726883071.69188: variable 'ansible_connection' from source: unknown 28983 1726883071.69191: variable 'ansible_module_compression' from source: unknown 28983 1726883071.69193: variable 'ansible_shell_type' from source: unknown 28983 1726883071.69195: variable 'ansible_shell_executable' from source: unknown 28983 1726883071.69197: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.69199: variable 'ansible_pipelining' from source: unknown 28983 1726883071.69201: variable 'ansible_timeout' from source: unknown 28983 1726883071.69203: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883071.69366: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883071.69387: variable 'omit' from source: magic vars 28983 1726883071.69403: starting attempt loop 28983 1726883071.69411: running the handler 28983 1726883071.69471: variable '__network_connections_result' from source: set_fact 28983 1726883071.69571: variable '__network_connections_result' from source: set_fact 28983 1726883071.69739: handler run complete 28983 1726883071.69782: attempt loop complete, returning result 28983 1726883071.69829: _execute() done 28983 1726883071.69833: dumping result to json 28983 1726883071.69837: done dumping result, returning 28983 1726883071.69840: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000184d] 28983 1726883071.69842: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184d ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d" ] } } 28983 1726883071.70058: no more pending results, returning what we have 28983 
1726883071.70062: results queue empty 28983 1726883071.70063: checking for any_errors_fatal 28983 1726883071.70072: done checking for any_errors_fatal 28983 1726883071.70073: checking for max_fail_percentage 28983 1726883071.70075: done checking for max_fail_percentage 28983 1726883071.70077: checking to see if all hosts have failed and the running result is not ok 28983 1726883071.70078: done checking to see if all hosts have failed 28983 1726883071.70079: getting the remaining hosts for this loop 28983 1726883071.70082: done getting the remaining hosts for this loop 28983 1726883071.70087: getting the next task for host managed_node2 28983 1726883071.70096: done getting next task for host managed_node2 28983 1726883071.70100: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883071.70108: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883071.70122: getting variables 28983 1726883071.70124: in VariableManager get_vars() 28983 1726883071.70464: Calling all_inventory to load vars for managed_node2 28983 1726883071.70468: Calling groups_inventory to load vars for managed_node2 28983 1726883071.70476: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184d 28983 1726883071.70488: WORKER PROCESS EXITING 28983 1726883071.70483: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883071.70498: Calling all_plugins_play to load vars for managed_node2 28983 1726883071.70502: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883071.70506: Calling groups_plugins_play to load vars for managed_node2 28983 1726883071.72919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883071.75657: done with get_vars() 28983 1726883071.75705: done getting variables 28983 1726883071.75787: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:31 -0400 (0:00:00.088) 0:01:41.756 ****** 28983 1726883071.75833: entering _queue_task() for managed_node2/debug 28983 1726883071.76467: worker is 1 (out of 1 available) 28983 1726883071.76479: exiting _queue_task() for managed_node2/debug 28983 1726883071.76491: done queuing things up, now waiting for results queue to drain 28983 1726883071.76493: waiting for pending results... 
28983 1726883071.76594: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883071.76830: in run() - task 0affe814-3a2d-b16d-c0a7-00000000184e 28983 1726883071.76836: variable 'ansible_search_path' from source: unknown 28983 1726883071.76839: variable 'ansible_search_path' from source: unknown 28983 1726883071.76938: calling self._execute() 28983 1726883071.77005: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.77018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.77036: variable 'omit' from source: magic vars 28983 1726883071.77486: variable 'ansible_distribution_major_version' from source: facts 28983 1726883071.77507: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883071.77653: variable 'network_state' from source: role '' defaults 28983 1726883071.77668: Evaluated conditional (network_state != {}): False 28983 1726883071.77676: when evaluation is False, skipping this task 28983 1726883071.77683: _execute() done 28983 1726883071.77805: dumping result to json 28983 1726883071.77810: done dumping result, returning 28983 1726883071.77813: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-00000000184e] 28983 1726883071.77816: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184e 28983 1726883071.77900: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184e 28983 1726883071.77903: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883071.77971: no more pending results, returning what we have 28983 1726883071.77976: results queue empty 28983 1726883071.77977: checking for any_errors_fatal 28983 1726883071.77995: done checking for any_errors_fatal 28983 1726883071.77997: checking for 
max_fail_percentage 28983 1726883071.77999: done checking for max_fail_percentage 28983 1726883071.78001: checking to see if all hosts have failed and the running result is not ok 28983 1726883071.78002: done checking to see if all hosts have failed 28983 1726883071.78003: getting the remaining hosts for this loop 28983 1726883071.78005: done getting the remaining hosts for this loop 28983 1726883071.78011: getting the next task for host managed_node2 28983 1726883071.78022: done getting next task for host managed_node2 28983 1726883071.78028: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883071.78038: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883071.78068: getting variables 28983 1726883071.78070: in VariableManager get_vars() 28983 1726883071.78128: Calling all_inventory to load vars for managed_node2 28983 1726883071.78133: Calling groups_inventory to load vars for managed_node2 28983 1726883071.78372: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883071.78382: Calling all_plugins_play to load vars for managed_node2 28983 1726883071.78387: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883071.78391: Calling groups_plugins_play to load vars for managed_node2 28983 1726883071.80800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883071.83932: done with get_vars() 28983 1726883071.83971: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:31 -0400 (0:00:00.082) 0:01:41.838 ****** 28983 1726883071.84089: entering _queue_task() for managed_node2/ping 28983 1726883071.84460: worker is 1 (out of 1 available) 28983 1726883071.84474: exiting _queue_task() for managed_node2/ping 28983 1726883071.84489: done queuing things up, now waiting for results queue to drain 28983 1726883071.84491: waiting for pending results... 
28983 1726883071.84811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883071.85140: in run() - task 0affe814-3a2d-b16d-c0a7-00000000184f 28983 1726883071.85145: variable 'ansible_search_path' from source: unknown 28983 1726883071.85148: variable 'ansible_search_path' from source: unknown 28983 1726883071.85151: calling self._execute() 28983 1726883071.85175: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.85189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.85208: variable 'omit' from source: magic vars 28983 1726883071.85667: variable 'ansible_distribution_major_version' from source: facts 28983 1726883071.85688: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883071.85706: variable 'omit' from source: magic vars 28983 1726883071.85794: variable 'omit' from source: magic vars 28983 1726883071.85841: variable 'omit' from source: magic vars 28983 1726883071.85890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883071.85938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883071.85966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883071.85989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883071.86007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883071.86059: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883071.86135: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.86141: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883071.86212: Set connection var ansible_connection to ssh 28983 1726883071.86233: Set connection var ansible_shell_executable to /bin/sh 28983 1726883071.86258: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883071.86274: Set connection var ansible_timeout to 10 28983 1726883071.86286: Set connection var ansible_pipelining to False 28983 1726883071.86294: Set connection var ansible_shell_type to sh 28983 1726883071.86325: variable 'ansible_shell_executable' from source: unknown 28983 1726883071.86336: variable 'ansible_connection' from source: unknown 28983 1726883071.86348: variable 'ansible_module_compression' from source: unknown 28983 1726883071.86359: variable 'ansible_shell_type' from source: unknown 28983 1726883071.86464: variable 'ansible_shell_executable' from source: unknown 28983 1726883071.86467: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883071.86469: variable 'ansible_pipelining' from source: unknown 28983 1726883071.86472: variable 'ansible_timeout' from source: unknown 28983 1726883071.86474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883071.86717: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883071.86741: variable 'omit' from source: magic vars 28983 1726883071.86752: starting attempt loop 28983 1726883071.86775: running the handler 28983 1726883071.86809: _low_level_execute_command(): starting 28983 1726883071.86822: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883071.87767: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.87843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883071.87884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.88013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883071.89776: stdout chunk (state=3): >>>/root <<< 28983 1726883071.89887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883071.90141: stderr chunk (state=3): >>><<< 28983 1726883071.90146: stdout chunk (state=3): >>><<< 28983 1726883071.90149: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883071.90152: _low_level_execute_command(): starting 28983 1726883071.90156: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107 `" && echo ansible-tmp-1726883071.8997934-32755-105008960517107="` echo /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107 `" ) && sleep 0' 28983 1726883071.90601: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883071.90605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.90618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.90637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.90650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883071.90657: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883071.90668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.90684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883071.90694: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883071.90710: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883071.90713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.90722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.90820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.90823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883071.90825: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883071.90828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.90846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883071.90888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.90957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883071.92979: stdout chunk (state=3): >>>ansible-tmp-1726883071.8997934-32755-105008960517107=/root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107 <<< 28983 1726883071.93181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883071.93207: stdout chunk (state=3): >>><<< 28983 1726883071.93220: stderr chunk (state=3): >>><<< 28983 1726883071.93254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883071.8997934-32755-105008960517107=/root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883071.93332: variable 'ansible_module_compression' from source: unknown 28983 1726883071.93407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883071.93443: variable 'ansible_facts' from source: unknown 28983 1726883071.93544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py 28983 1726883071.93802: Sending initial data 28983 1726883071.93806: Sent initial data (153 bytes) 28983 1726883071.94640: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.94708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883071.94761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883071.94866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.95068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883071.96742: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883071.96819: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883071.96900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8lyafbu1 /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py <<< 28983 1726883071.96903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py" <<< 28983 1726883071.96976: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8lyafbu1" to remote "/root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py" <<< 28983 1726883071.98091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883071.98339: stderr chunk (state=3): >>><<< 28983 1726883071.98343: stdout chunk (state=3): >>><<< 28983 1726883071.98346: done transferring module to remote 28983 1726883071.98348: _low_level_execute_command(): starting 28983 1726883071.98351: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/ /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py && sleep 0' 28983 1726883071.98807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883071.98817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.98828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.98847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.98860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 
1726883071.98868: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883071.98882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883071.98903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883071.98909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883071.98912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883071.98923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883071.98935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883071.98954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883071.98963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883071.98970: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883071.99099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883071.99103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883071.99175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883072.01165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883072.01169: stdout chunk (state=3): >>><<< 28983 1726883072.01174: stderr chunk (state=3): >>><<< 28983 1726883072.01192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883072.01284: _low_level_execute_command(): starting 28983 1726883072.01288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/AnsiballZ_ping.py && sleep 0' 28983 1726883072.01814: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883072.01830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883072.01848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883072.01955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.01967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883072.01987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883072.02032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883072.02145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883072.19313: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883072.20942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883072.20947: stdout chunk (state=3): >>><<< 28983 1726883072.20949: stderr chunk (state=3): >>><<< 28983 1726883072.20952: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
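The round-trip above shows Ansible transferring `AnsiballZ_ping.py` over sftp, marking it executable, running it with the remote Python, and reading the module's JSON result from stdout. A minimal sketch of what the module side of that exchange produces (an illustrative stand-in, not the real `ansible.builtin.ping` implementation):

```python
# Illustrative sketch: the ping module receives {"data": "pong"} and
# echoes it back as a JSON result on stdout, which is what appears in
# the trace as {"ping": "pong", "invocation": {...}}.
import json

def ping_module(module_args):
    # Simplified stand-in for ansible.builtin.ping: return the "data"
    # value under the "ping" key, defaulting to "pong".
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
```

The controller parses this stdout JSON into the task result that later surfaces as `ok: [managed_node2] => {"changed": false, "ping": "pong"}`.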
28983 1726883072.20955: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883072.20957: _low_level_execute_command(): starting 28983 1726883072.20960: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883071.8997934-32755-105008960517107/ > /dev/null 2>&1 && sleep 0' 28983 1726883072.21612: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883072.21622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883072.21635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883072.21660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883072.21674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883072.21694: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883072.21704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.21721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883072.21729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 
1726883072.21738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883072.21750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883072.21761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883072.21775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883072.21801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.21886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883072.21911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883072.22004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883072.24020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883072.24024: stdout chunk (state=3): >>><<< 28983 1726883072.24032: stderr chunk (state=3): >>><<< 28983 1726883072.24071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883072.24082: handler run complete 28983 1726883072.24104: attempt loop complete, returning result 28983 1726883072.24107: _execute() done 28983 1726883072.24110: dumping result to json 28983 1726883072.24116: done dumping result, returning 28983 1726883072.24127: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-00000000184f] 28983 1726883072.24239: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184f 28983 1726883072.24311: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000184f 28983 1726883072.24315: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883072.24503: no more pending results, returning what we have 28983 1726883072.24508: results queue empty 28983 1726883072.24509: checking for any_errors_fatal 28983 1726883072.24517: done checking for any_errors_fatal 28983 1726883072.24518: checking for max_fail_percentage 28983 1726883072.24521: done checking for max_fail_percentage 28983 1726883072.24522: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.24523: done checking to see if all hosts have failed 28983 1726883072.24524: getting the remaining hosts for this loop 28983 1726883072.24526: done getting the remaining hosts for this loop 28983 1726883072.24532: getting the next task for host managed_node2 28983 1726883072.24605: done getting next task for host managed_node2 28983 1726883072.24608: ^ task is: TASK: meta 
(role_complete) 28983 1726883072.24615: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883072.24631: getting variables 28983 1726883072.24633: in VariableManager get_vars() 28983 1726883072.24830: Calling all_inventory to load vars for managed_node2 28983 1726883072.24836: Calling groups_inventory to load vars for managed_node2 28983 1726883072.24839: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.24849: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.24853: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.24857: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.27580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.30768: done with get_vars() 28983 1726883072.30808: done getting variables 28983 1726883072.30913: done queuing things up, now waiting for results queue to drain 28983 1726883072.30916: results queue empty 28983 1726883072.30917: checking for any_errors_fatal 28983 1726883072.30920: done checking for any_errors_fatal 28983 1726883072.30921: checking for max_fail_percentage 28983 1726883072.30923: done checking for max_fail_percentage 28983 1726883072.30924: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.30925: done checking to see if all hosts have failed 28983 1726883072.30926: getting the remaining hosts for this loop 28983 1726883072.30927: done getting the remaining hosts for this loop 28983 1726883072.30930: getting the next task for host managed_node2 28983 1726883072.30937: done getting next task for host managed_node2 28983 1726883072.30940: ^ task is: TASK: Show result 28983 1726883072.30943: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883072.30946: getting variables 28983 1726883072.30948: in VariableManager get_vars() 28983 1726883072.30965: Calling all_inventory to load vars for managed_node2 28983 1726883072.30968: Calling groups_inventory to load vars for managed_node2 28983 1726883072.30978: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.30984: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.30987: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.30991: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.33278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.36546: done with get_vars() 28983 1726883072.36592: done getting variables 28983 1726883072.36650: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:44:32 -0400 (0:00:00.526) 0:01:42.364 ****** 28983 1726883072.36702: entering _queue_task() for managed_node2/debug 28983 1726883072.37205: worker is 1 (out of 1 available) 28983 1726883072.37219: exiting _queue_task() for managed_node2/debug 28983 1726883072.37342: done queuing things up, now waiting for results queue to drain 28983 1726883072.37345: waiting for pending results... 28983 1726883072.37569: running TaskExecutor() for managed_node2/TASK: Show result 28983 1726883072.37845: in run() - task 0affe814-3a2d-b16d-c0a7-0000000017d1 28983 1726883072.37849: variable 'ansible_search_path' from source: unknown 28983 1726883072.37852: variable 'ansible_search_path' from source: unknown 28983 1726883072.37855: calling self._execute() 28983 1726883072.37906: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.37914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.37927: variable 'omit' from source: magic vars 28983 1726883072.38393: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.38408: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.38414: variable 'omit' from source: magic vars 28983 1726883072.38481: variable 'omit' from source: magic vars 28983 1726883072.38528: variable 'omit' from source: magic vars 28983 1726883072.38582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883072.38626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883072.38655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883072.38677: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883072.38690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883072.38729: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883072.38733: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.38839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.38865: Set connection var ansible_connection to ssh 28983 1726883072.38884: Set connection var ansible_shell_executable to /bin/sh 28983 1726883072.38895: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883072.38906: Set connection var ansible_timeout to 10 28983 1726883072.38914: Set connection var ansible_pipelining to False 28983 1726883072.38916: Set connection var ansible_shell_type to sh 28983 1726883072.38949: variable 'ansible_shell_executable' from source: unknown 28983 1726883072.38952: variable 'ansible_connection' from source: unknown 28983 1726883072.38955: variable 'ansible_module_compression' from source: unknown 28983 1726883072.38960: variable 'ansible_shell_type' from source: unknown 28983 1726883072.38962: variable 'ansible_shell_executable' from source: unknown 28983 1726883072.38968: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.38978: variable 'ansible_pipelining' from source: unknown 28983 1726883072.38982: variable 'ansible_timeout' from source: unknown 28983 1726883072.38989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.39163: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883072.39177: variable 'omit' from source: magic vars 28983 1726883072.39183: starting attempt loop 28983 1726883072.39186: running the handler 28983 1726883072.39339: variable '__network_connections_result' from source: set_fact 28983 1726883072.39347: variable '__network_connections_result' from source: set_fact 28983 1726883072.39511: handler run complete 28983 1726883072.39554: attempt loop complete, returning result 28983 1726883072.39558: _execute() done 28983 1726883072.39560: dumping result to json 28983 1726883072.39568: done dumping result, returning 28983 1726883072.39581: done running TaskExecutor() for managed_node2/TASK: Show result [0affe814-3a2d-b16d-c0a7-0000000017d1] 28983 1726883072.39588: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d1 28983 1726883072.39839: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d1 28983 1726883072.39843: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d" ] } } 28983 1726883072.39930: no more pending results, returning what we have 28983 1726883072.39935: results queue empty 28983 1726883072.39936: checking for any_errors_fatal 28983 
1726883072.39939: done checking for any_errors_fatal 28983 1726883072.39940: checking for max_fail_percentage 28983 1726883072.39942: done checking for max_fail_percentage 28983 1726883072.39943: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.39944: done checking to see if all hosts have failed 28983 1726883072.39946: getting the remaining hosts for this loop 28983 1726883072.39948: done getting the remaining hosts for this loop 28983 1726883072.39952: getting the next task for host managed_node2 28983 1726883072.39964: done getting next task for host managed_node2 28983 1726883072.39967: ^ task is: TASK: Include network role 28983 1726883072.39972: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883072.39977: getting variables 28983 1726883072.39979: in VariableManager get_vars() 28983 1726883072.40017: Calling all_inventory to load vars for managed_node2 28983 1726883072.40020: Calling groups_inventory to load vars for managed_node2 28983 1726883072.40024: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.40157: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.40162: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.40168: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.42423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.51207: done with get_vars() 28983 1726883072.51254: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:44:32 -0400 (0:00:00.146) 0:01:42.511 ****** 28983 1726883072.51363: entering _queue_task() for managed_node2/include_role 28983 1726883072.51821: worker is 1 (out of 1 available) 28983 1726883072.51839: exiting _queue_task() for managed_node2/include_role 28983 1726883072.51852: done queuing things up, now waiting for results queue to drain 28983 1726883072.51854: waiting for pending results... 
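In the Show result output above, the `stderr_lines` key mirrors `stderr` split on newlines; a sketch of that derivation (illustrative only — the actual splitting lives in Ansible's module result handling):

```python
# Sketch: Ansible derives *_lines keys by splitting the raw stream.
# The string below is the stderr value from the Show result task above.
stderr = ("[002] #0, state:None persistent_state:present, 'statebr': "
          "add connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d\n")
stderr_lines = stderr.splitlines()  # trailing newline yields one element
print(stderr_lines)
```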
28983 1726883072.52142: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883072.52327: in run() - task 0affe814-3a2d-b16d-c0a7-0000000017d5 28983 1726883072.52349: variable 'ansible_search_path' from source: unknown 28983 1726883072.52354: variable 'ansible_search_path' from source: unknown 28983 1726883072.52400: calling self._execute() 28983 1726883072.52532: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.52539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.52542: variable 'omit' from source: magic vars 28983 1726883072.52942: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.52953: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.52959: _execute() done 28983 1726883072.52963: dumping result to json 28983 1726883072.52968: done dumping result, returning 28983 1726883072.52975: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-0000000017d5] 28983 1726883072.52984: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d5 28983 1726883072.53116: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d5 28983 1726883072.53119: WORKER PROCESS EXITING 28983 1726883072.53159: no more pending results, returning what we have 28983 1726883072.53165: in VariableManager get_vars() 28983 1726883072.53214: Calling all_inventory to load vars for managed_node2 28983 1726883072.53217: Calling groups_inventory to load vars for managed_node2 28983 1726883072.53221: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.53239: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.53243: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.53247: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.54512: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.56647: done with get_vars() 28983 1726883072.56668: variable 'ansible_search_path' from source: unknown 28983 1726883072.56669: variable 'ansible_search_path' from source: unknown 28983 1726883072.56789: variable 'omit' from source: magic vars 28983 1726883072.56824: variable 'omit' from source: magic vars 28983 1726883072.56837: variable 'omit' from source: magic vars 28983 1726883072.56840: we have included files to process 28983 1726883072.56841: generating all_blocks data 28983 1726883072.56844: done generating all_blocks data 28983 1726883072.56848: processing included file: fedora.linux_system_roles.network 28983 1726883072.56863: in VariableManager get_vars() 28983 1726883072.56875: done with get_vars() 28983 1726883072.56900: in VariableManager get_vars() 28983 1726883072.56914: done with get_vars() 28983 1726883072.56946: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883072.57047: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883072.57110: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883072.57487: in VariableManager get_vars() 28983 1726883072.57504: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883072.59724: iterating over new_blocks loaded from include file 28983 1726883072.59726: in VariableManager get_vars() 28983 1726883072.59743: done with get_vars() 28983 1726883072.59744: filtering new block on tags 28983 1726883072.59988: done filtering new block on tags 28983 1726883072.59991: in VariableManager get_vars() 28983 1726883072.60003: done with get_vars() 28983 1726883072.60004: filtering new block on tags 28983 1726883072.60017: done 
filtering new block on tags 28983 1726883072.60019: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883072.60023: extending task lists for all hosts with included blocks 28983 1726883072.60116: done extending task lists 28983 1726883072.60117: done processing included files 28983 1726883072.60118: results queue empty 28983 1726883072.60118: checking for any_errors_fatal 28983 1726883072.60122: done checking for any_errors_fatal 28983 1726883072.60123: checking for max_fail_percentage 28983 1726883072.60123: done checking for max_fail_percentage 28983 1726883072.60124: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.60125: done checking to see if all hosts have failed 28983 1726883072.60125: getting the remaining hosts for this loop 28983 1726883072.60126: done getting the remaining hosts for this loop 28983 1726883072.60128: getting the next task for host managed_node2 28983 1726883072.60132: done getting next task for host managed_node2 28983 1726883072.60136: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883072.60138: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883072.60147: getting variables 28983 1726883072.60148: in VariableManager get_vars() 28983 1726883072.60160: Calling all_inventory to load vars for managed_node2 28983 1726883072.60162: Calling groups_inventory to load vars for managed_node2 28983 1726883072.60164: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.60168: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.60170: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.60174: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.61637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.63945: done with get_vars() 28983 1726883072.63967: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:32 -0400 (0:00:00.126) 0:01:42.638 ****** 28983 1726883072.64026: entering _queue_task() for managed_node2/include_tasks 28983 1726883072.64300: worker is 1 (out of 1 available) 28983 1726883072.64314: exiting _queue_task() for managed_node2/include_tasks 28983 1726883072.64327: done queuing things up, now waiting for results queue to drain 28983 1726883072.64329: waiting for pending results... 
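Every task in this trace is gated on the same conditional, `ansible_distribution_major_version != '6'`, which the log reports as `Evaluated conditional (...): True` before each task runs. Note that this fact is a string, so the test is string inequality, not numeric comparison. A minimal sketch of that gate (the version values below are illustrative, not from this run):

```python
# Sketch of the conditional evaluated before each task in this log:
#   when: ansible_distribution_major_version != '6'
# ansible_distribution_major_version is a *string* fact, so this is
# string inequality, not a numeric comparison.
def should_run(distro_major: str) -> bool:
    """Mirror of the `!= '6'` gate the role applies to every task."""
    return distro_major != '6'

assert should_run('39') is True   # hypothetical Fedora 39 host: task runs
assert should_run('6') is False   # EL6 is excluded by the role
```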
28983 1726883072.64527: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883072.64630: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019bf 28983 1726883072.64646: variable 'ansible_search_path' from source: unknown 28983 1726883072.64650: variable 'ansible_search_path' from source: unknown 28983 1726883072.64688: calling self._execute() 28983 1726883072.64782: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.64786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.64799: variable 'omit' from source: magic vars 28983 1726883072.65136: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.65155: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.65159: _execute() done 28983 1726883072.65162: dumping result to json 28983 1726883072.65166: done dumping result, returning 28983 1726883072.65194: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-0000000019bf] 28983 1726883072.65198: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019bf 28983 1726883072.65316: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019bf 28983 1726883072.65319: WORKER PROCESS EXITING 28983 1726883072.65401: no more pending results, returning what we have 28983 1726883072.65406: in VariableManager get_vars() 28983 1726883072.65458: Calling all_inventory to load vars for managed_node2 28983 1726883072.65462: Calling groups_inventory to load vars for managed_node2 28983 1726883072.65465: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.65475: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.65478: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.65482: Calling 
groups_plugins_play to load vars for managed_node2 28983 1726883072.67329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.68930: done with get_vars() 28983 1726883072.68952: variable 'ansible_search_path' from source: unknown 28983 1726883072.68953: variable 'ansible_search_path' from source: unknown 28983 1726883072.68983: we have included files to process 28983 1726883072.68984: generating all_blocks data 28983 1726883072.68986: done generating all_blocks data 28983 1726883072.68988: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883072.68989: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883072.68991: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883072.69464: done processing included file 28983 1726883072.69467: iterating over new_blocks loaded from include file 28983 1726883072.69468: in VariableManager get_vars() 28983 1726883072.69488: done with get_vars() 28983 1726883072.69490: filtering new block on tags 28983 1726883072.69514: done filtering new block on tags 28983 1726883072.69516: in VariableManager get_vars() 28983 1726883072.69535: done with get_vars() 28983 1726883072.69536: filtering new block on tags 28983 1726883072.69574: done filtering new block on tags 28983 1726883072.69577: in VariableManager get_vars() 28983 1726883072.69597: done with get_vars() 28983 1726883072.69598: filtering new block on tags 28983 1726883072.69632: done filtering new block on tags 28983 1726883072.69633: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883072.69639: extending task lists for 
all hosts with included blocks 28983 1726883072.71015: done extending task lists 28983 1726883072.71017: done processing included files 28983 1726883072.71017: results queue empty 28983 1726883072.71018: checking for any_errors_fatal 28983 1726883072.71021: done checking for any_errors_fatal 28983 1726883072.71021: checking for max_fail_percentage 28983 1726883072.71022: done checking for max_fail_percentage 28983 1726883072.71023: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.71023: done checking to see if all hosts have failed 28983 1726883072.71024: getting the remaining hosts for this loop 28983 1726883072.71025: done getting the remaining hosts for this loop 28983 1726883072.71027: getting the next task for host managed_node2 28983 1726883072.71031: done getting next task for host managed_node2 28983 1726883072.71033: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883072.71037: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883072.71047: getting variables 28983 1726883072.71048: in VariableManager get_vars() 28983 1726883072.71059: Calling all_inventory to load vars for managed_node2 28983 1726883072.71060: Calling groups_inventory to load vars for managed_node2 28983 1726883072.71062: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.71066: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.71068: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.71070: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.72141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.73784: done with get_vars() 28983 1726883072.73809: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:44:32 -0400 (0:00:00.098) 0:01:42.736 ****** 28983 1726883072.73871: entering _queue_task() for managed_node2/setup 28983 1726883072.74150: worker is 1 (out of 1 available) 28983 1726883072.74165: exiting _queue_task() for managed_node2/setup 28983 1726883072.74178: done queuing things up, now waiting for results queue to drain 28983 1726883072.74180: waiting for pending results... 
28983 1726883072.74386: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883072.74517: in run() - task 0affe814-3a2d-b16d-c0a7-000000001a16 28983 1726883072.74530: variable 'ansible_search_path' from source: unknown 28983 1726883072.74533: variable 'ansible_search_path' from source: unknown 28983 1726883072.74567: calling self._execute() 28983 1726883072.74656: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.74662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.74673: variable 'omit' from source: magic vars 28983 1726883072.75011: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.75024: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.75214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883072.76988: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883072.77041: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883072.77071: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883072.77106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883072.77130: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883072.77200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883072.77227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883072.77251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883072.77287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883072.77299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883072.77349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883072.77369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883072.77392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883072.77426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883072.77438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883072.77569: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883072.77579: variable 'ansible_facts' from source: unknown 28983 1726883072.78284: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883072.78290: when evaluation is False, skipping this task 28983 1726883072.78294: _execute() done 28983 1726883072.78298: dumping result to json 28983 1726883072.78301: done dumping result, returning 28983 1726883072.78312: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000001a16] 28983 1726883072.78315: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a16 28983 1726883072.78400: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a16 28983 1726883072.78403: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883072.78471: no more pending results, returning what we have 28983 1726883072.78475: results queue empty 28983 1726883072.78476: checking for any_errors_fatal 28983 1726883072.78478: done checking for any_errors_fatal 28983 1726883072.78479: checking for max_fail_percentage 28983 1726883072.78482: done checking for max_fail_percentage 28983 1726883072.78483: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.78484: done checking to see if all hosts have failed 28983 1726883072.78485: getting the remaining hosts for this loop 28983 1726883072.78487: done getting the remaining hosts for this loop 28983 1726883072.78491: getting the next task for host managed_node2 28983 1726883072.78501: done getting next task for host managed_node2 28983 1726883072.78506: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883072.78514: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883072.78546: getting variables 28983 1726883072.78548: in VariableManager get_vars() 28983 1726883072.78589: Calling all_inventory to load vars for managed_node2 28983 1726883072.78592: Calling groups_inventory to load vars for managed_node2 28983 1726883072.78595: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.78604: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.78607: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.78616: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.79879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.81497: done with get_vars() 28983 1726883072.81519: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:44:32 -0400 (0:00:00.077) 0:01:42.813 ****** 28983 1726883072.81600: entering _queue_task() for managed_node2/stat 28983 1726883072.81831: worker is 1 (out of 1 available) 28983 1726883072.81847: exiting _queue_task() for managed_node2/stat 28983 1726883072.81861: done queuing things up, now waiting for results queue to drain 28983 1726883072.81862: waiting for pending results... 
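The "Ensure ansible_facts used by role are present" task above was skipped because its condition, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluated to False: no required fact was missing, so there was nothing to gather. A rough sketch of that set-difference check (the fact names below are hypothetical, not the role's actual `__network_required_facts` list, and the list comprehension is only an approximation of Ansible's `difference` filter, which applies set semantics):

```python
# Approximation of the gating filter from the skipped task above:
#   when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
def missing_facts(required, gathered):
    """Items of `required` not present in `gathered` (roughly Ansible's
    `difference` filter, which deduplicates via set semantics)."""
    return [f for f in required if f not in gathered]

required = ['distribution', 'distribution_major_version']  # hypothetical
gathered = {'distribution': 'Fedora', 'distribution_major_version': '39'}

# Nothing is missing, so `length > 0` is False and the task is skipped,
# matching the "when evaluation is False, skipping this task" entry above.
assert len(missing_facts(required, gathered.keys())) == 0
```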
28983 1726883072.82051: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883072.82186: in run() - task 0affe814-3a2d-b16d-c0a7-000000001a18 28983 1726883072.82200: variable 'ansible_search_path' from source: unknown 28983 1726883072.82204: variable 'ansible_search_path' from source: unknown 28983 1726883072.82238: calling self._execute() 28983 1726883072.82323: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.82328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.82340: variable 'omit' from source: magic vars 28983 1726883072.82663: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.82674: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.82815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883072.83042: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883072.83082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883072.83112: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883072.83143: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883072.83217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883072.83238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883072.83261: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883072.83287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883072.83365: variable '__network_is_ostree' from source: set_fact 28983 1726883072.83372: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883072.83378: when evaluation is False, skipping this task 28983 1726883072.83381: _execute() done 28983 1726883072.83386: dumping result to json 28983 1726883072.83393: done dumping result, returning 28983 1726883072.83401: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000001a18] 28983 1726883072.83404: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a18 28983 1726883072.83500: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a18 28983 1726883072.83504: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883072.83563: no more pending results, returning what we have 28983 1726883072.83567: results queue empty 28983 1726883072.83568: checking for any_errors_fatal 28983 1726883072.83574: done checking for any_errors_fatal 28983 1726883072.83575: checking for max_fail_percentage 28983 1726883072.83577: done checking for max_fail_percentage 28983 1726883072.83578: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.83579: done checking to see if all hosts have failed 28983 1726883072.83580: getting the remaining hosts for this loop 28983 1726883072.83582: done getting the remaining hosts for this loop 28983 
1726883072.83586: getting the next task for host managed_node2 28983 1726883072.83593: done getting next task for host managed_node2 28983 1726883072.83597: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883072.83604: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883072.83625: getting variables 28983 1726883072.83626: in VariableManager get_vars() 28983 1726883072.83665: Calling all_inventory to load vars for managed_node2 28983 1726883072.83668: Calling groups_inventory to load vars for managed_node2 28983 1726883072.83671: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.83679: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.83684: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.83687: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.85063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.86632: done with get_vars() 28983 1726883072.86658: done getting variables 28983 1726883072.86702: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:44:32 -0400 (0:00:00.051) 0:01:42.865 ****** 28983 1726883072.86730: entering _queue_task() for managed_node2/set_fact 28983 1726883072.86955: worker is 1 (out of 1 available) 28983 1726883072.86969: exiting _queue_task() for managed_node2/set_fact 28983 1726883072.86981: done queuing things up, now waiting for results queue to drain 28983 1726883072.86983: waiting for pending results... 
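Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") skip with `false_condition: "not __network_is_ostree is defined"`: a `set_fact` earlier in the run already stored `__network_is_ostree`, so the guard prevents re-checking on this second pass through the role. A small sketch of that idempotence guard (the fact value below is made up for illustration):

```python
# Sketch of the `is defined` guard on the two skipped ostree tasks:
#   when: not __network_is_ostree is defined
# Once set_fact has stored the flag, later role invocations skip both
# the stat check and the flag-setting task.
def guard(host_facts: dict) -> bool:
    """True only when the flag has not been computed yet."""
    return '__network_is_ostree' not in host_facts

assert guard({}) is True                               # first run: do the check
assert guard({'__network_is_ostree': False}) is False  # later runs: skip, as logged
```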
28983 1726883072.87168: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883072.87292: in run() - task 0affe814-3a2d-b16d-c0a7-000000001a19 28983 1726883072.87305: variable 'ansible_search_path' from source: unknown 28983 1726883072.87309: variable 'ansible_search_path' from source: unknown 28983 1726883072.87343: calling self._execute() 28983 1726883072.87431: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.87438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.87450: variable 'omit' from source: magic vars 28983 1726883072.87763: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.87774: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.87913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883072.88142: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883072.88183: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883072.88216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883072.88246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883072.88321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883072.88343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883072.88365: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883072.88390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883072.88466: variable '__network_is_ostree' from source: set_fact 28983 1726883072.88473: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883072.88479: when evaluation is False, skipping this task 28983 1726883072.88483: _execute() done 28983 1726883072.88486: dumping result to json 28983 1726883072.88492: done dumping result, returning 28983 1726883072.88500: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000001a19] 28983 1726883072.88505: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a19 28983 1726883072.88599: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a19 28983 1726883072.88602: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883072.88682: no more pending results, returning what we have 28983 1726883072.88686: results queue empty 28983 1726883072.88687: checking for any_errors_fatal 28983 1726883072.88692: done checking for any_errors_fatal 28983 1726883072.88693: checking for max_fail_percentage 28983 1726883072.88695: done checking for max_fail_percentage 28983 1726883072.88696: checking to see if all hosts have failed and the running result is not ok 28983 1726883072.88697: done checking to see if all hosts have failed 28983 1726883072.88698: getting the remaining hosts for this loop 28983 1726883072.88700: done getting the remaining hosts for this loop 
28983 1726883072.88703: getting the next task for host managed_node2 28983 1726883072.88713: done getting next task for host managed_node2 28983 1726883072.88717: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883072.88724: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883072.88753: getting variables 28983 1726883072.88755: in VariableManager get_vars() 28983 1726883072.88792: Calling all_inventory to load vars for managed_node2 28983 1726883072.88794: Calling groups_inventory to load vars for managed_node2 28983 1726883072.88796: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883072.88802: Calling all_plugins_play to load vars for managed_node2 28983 1726883072.88805: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883072.88807: Calling groups_plugins_play to load vars for managed_node2 28983 1726883072.90032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883072.91765: done with get_vars() 28983 1726883072.91790: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:44:32 -0400 (0:00:00.051) 0:01:42.916 ****** 28983 1726883072.91878: entering _queue_task() for managed_node2/service_facts 28983 1726883072.92158: worker is 1 (out of 1 available) 28983 1726883072.92173: exiting _queue_task() for managed_node2/service_facts 28983 1726883072.92186: done queuing things up, now waiting for results queue to drain 28983 1726883072.92188: waiting for pending results... 
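The `_queue_task()` / "waiting for pending results..." lines describe a producer-consumer handoff: the strategy hands the task to a worker process, then drains a results queue. A hedged sketch of that pattern (using a thread and `queue.Queue` for illustration; Ansible's actual WorkerProcess machinery is process-based and more involved):

```python
import queue
import threading

# Results queue the main loop drains ("waiting for pending results...")
results: "queue.Queue[dict]" = queue.Queue()

def worker(task_name: str) -> None:
    # Worker executes the task and puts its result on the shared queue,
    # then exits ("WORKER PROCESS EXITING" in the log).
    results.put({"task": task_name, "changed": False})

# _queue_task(): hand the task to a worker (worker 1 out of 1 available)
t = threading.Thread(
    target=worker,
    args=("fedora.linux_system_roles.network : Check which services are running",),
)
t.start()
t.join()                    # wait for the pending result
res = results.get_nowait()  # drain the results queue
print(res["task"])
```

The real strategy plugin interleaves this with variable loading (`get_vars()`) and host-state bookkeeping, as the surrounding log shows.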
28983 1726883072.92396: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883072.92518: in run() - task 0affe814-3a2d-b16d-c0a7-000000001a1b 28983 1726883072.92537: variable 'ansible_search_path' from source: unknown 28983 1726883072.92543: variable 'ansible_search_path' from source: unknown 28983 1726883072.92573: calling self._execute() 28983 1726883072.92669: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.92676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.92688: variable 'omit' from source: magic vars 28983 1726883072.93027: variable 'ansible_distribution_major_version' from source: facts 28983 1726883072.93040: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883072.93048: variable 'omit' from source: magic vars 28983 1726883072.93118: variable 'omit' from source: magic vars 28983 1726883072.93149: variable 'omit' from source: magic vars 28983 1726883072.93191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883072.93226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883072.93246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883072.93264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883072.93278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883072.93310: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883072.93314: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.93317: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883072.93401: Set connection var ansible_connection to ssh 28983 1726883072.93413: Set connection var ansible_shell_executable to /bin/sh 28983 1726883072.93423: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883072.93431: Set connection var ansible_timeout to 10 28983 1726883072.93439: Set connection var ansible_pipelining to False 28983 1726883072.93442: Set connection var ansible_shell_type to sh 28983 1726883072.93463: variable 'ansible_shell_executable' from source: unknown 28983 1726883072.93466: variable 'ansible_connection' from source: unknown 28983 1726883072.93469: variable 'ansible_module_compression' from source: unknown 28983 1726883072.93473: variable 'ansible_shell_type' from source: unknown 28983 1726883072.93480: variable 'ansible_shell_executable' from source: unknown 28983 1726883072.93482: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883072.93488: variable 'ansible_pipelining' from source: unknown 28983 1726883072.93491: variable 'ansible_timeout' from source: unknown 28983 1726883072.93497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883072.93669: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883072.93684: variable 'omit' from source: magic vars 28983 1726883072.93690: starting attempt loop 28983 1726883072.93693: running the handler 28983 1726883072.93706: _low_level_execute_command(): starting 28983 1726883072.93714: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883072.94270: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883072.94277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.94281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883072.94283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.94330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883072.94337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883072.94444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883072.96224: stdout chunk (state=3): >>>/root <<< 28983 1726883072.96331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883072.96391: stderr chunk (state=3): >>><<< 28983 1726883072.96395: stdout chunk (state=3): >>><<< 28983 1726883072.96417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883072.96430: _low_level_execute_command(): starting 28983 1726883072.96437: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703 `" && echo ansible-tmp-1726883072.9641733-32791-92110522247703="` echo /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703 `" ) && sleep 0' 28983 1726883072.96911: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883072.96917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.96919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28983 1726883072.96930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883072.96982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883072.96989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883072.97064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883072.99069: stdout chunk (state=3): >>>ansible-tmp-1726883072.9641733-32791-92110522247703=/root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703 <<< 28983 1726883072.99189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883072.99244: stderr chunk (state=3): >>><<< 28983 1726883072.99248: stdout chunk (state=3): >>><<< 28983 1726883072.99264: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883072.9641733-32791-92110522247703=/root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883072.99307: variable 'ansible_module_compression' from source: unknown 28983 1726883072.99351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883072.99385: variable 'ansible_facts' from source: unknown 28983 1726883072.99450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py 28983 1726883072.99572: Sending initial data 28983 1726883072.99576: Sent initial data (161 bytes) 28983 1726883073.00011: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883073.00051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883073.00055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883073.00058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883073.00060: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883073.00108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883073.00111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883073.00183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883073.01800: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883073.01867: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883073.01933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmph40jpl51 /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py <<< 28983 1726883073.01938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py" <<< 28983 1726883073.02000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmph40jpl51" to remote "/root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py" <<< 28983 1726883073.02924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883073.02998: stderr chunk (state=3): >>><<< 28983 1726883073.03002: stdout chunk (state=3): >>><<< 28983 1726883073.03020: done transferring module to remote 28983 1726883073.03029: _low_level_execute_command(): starting 28983 1726883073.03037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/ /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py && sleep 0' 28983 1726883073.03512: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883073.03516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883073.03519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883073.03521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883073.03523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883073.03574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883073.03583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883073.03658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883073.05519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883073.05572: stderr chunk (state=3): >>><<< 28983 1726883073.05579: stdout chunk (state=3): >>><<< 28983 1726883073.05594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883073.05598: _low_level_execute_command(): starting 28983 1726883073.05604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/AnsiballZ_service_facts.py && sleep 0' 28983 1726883073.06077: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883073.06081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883073.06084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883073.06086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883073.06088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883073.06144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883073.06157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883073.06227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883075.02660: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state":<<< 28983 1726883075.02733: stdout chunk (state=3): >>> "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-login<<< 28983 1726883075.02777: stdout chunk (state=3): >>>d.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": 
"grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service"<<< 28983 1726883075.02788: stdout chunk (state=3): >>>: {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883075.04404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883075.04408: stdout chunk (state=3): >>><<< 28983 1726883075.04639: stderr chunk (state=3): >>><<< 28983 1726883075.04647: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883075.05766: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883075.05790: _low_level_execute_command(): starting 28983 1726883075.05801: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883072.9641733-32791-92110522247703/ > /dev/null 2>&1 && sleep 0' 28983 1726883075.06503: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883075.06552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883075.06568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883075.06606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.06684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883075.06718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883075.06827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883075.08846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883075.08856: stdout chunk (state=3): >>><<< 28983 1726883075.08869: stderr chunk (state=3): >>><<< 28983 1726883075.08891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883075.08903: handler run complete 28983 1726883075.09226: variable 'ansible_facts' from source: unknown 28983 1726883075.09490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883075.10369: variable 'ansible_facts' from source: unknown 28983 1726883075.10597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883075.11003: attempt loop complete, returning result 28983 1726883075.11015: _execute() done 28983 1726883075.11023: dumping result to json 28983 1726883075.11119: done dumping result, returning 28983 1726883075.11133: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000001a1b] 28983 1726883075.11148: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a1b 28983 1726883075.12580: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a1b 28983 1726883075.12584: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883075.12758: no more pending results, returning what we have 28983 1726883075.12761: results queue empty 28983 1726883075.12762: checking for any_errors_fatal 28983 1726883075.12768: done checking for any_errors_fatal 28983 1726883075.12769: checking for max_fail_percentage 28983 1726883075.12771: done checking for max_fail_percentage 28983 1726883075.12774: checking to see if all hosts have failed and the running result is 
not ok 28983 1726883075.12775: done checking to see if all hosts have failed 28983 1726883075.12776: getting the remaining hosts for this loop 28983 1726883075.12778: done getting the remaining hosts for this loop 28983 1726883075.12782: getting the next task for host managed_node2 28983 1726883075.12790: done getting next task for host managed_node2 28983 1726883075.12794: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883075.12802: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883075.12817: getting variables 28983 1726883075.12818: in VariableManager get_vars() 28983 1726883075.12862: Calling all_inventory to load vars for managed_node2 28983 1726883075.12865: Calling groups_inventory to load vars for managed_node2 28983 1726883075.12868: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883075.12882: Calling all_plugins_play to load vars for managed_node2 28983 1726883075.12886: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883075.12890: Calling groups_plugins_play to load vars for managed_node2 28983 1726883075.15384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883075.19991: done with get_vars() 28983 1726883075.20029: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:44:35 -0400 (0:00:02.285) 0:01:45.202 ****** 28983 1726883075.20484: entering _queue_task() for managed_node2/package_facts 28983 1726883075.21078: worker is 1 (out of 1 available) 28983 1726883075.21090: exiting _queue_task() for managed_node2/package_facts 28983 1726883075.21106: done queuing things up, now waiting for results queue to drain 28983 1726883075.21108: waiting for pending results... 
28983 1726883075.21469: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883075.21740: in run() - task 0affe814-3a2d-b16d-c0a7-000000001a1c 28983 1726883075.21746: variable 'ansible_search_path' from source: unknown 28983 1726883075.21749: variable 'ansible_search_path' from source: unknown 28983 1726883075.21752: calling self._execute() 28983 1726883075.21894: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883075.21912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883075.21932: variable 'omit' from source: magic vars 28983 1726883075.22424: variable 'ansible_distribution_major_version' from source: facts 28983 1726883075.22457: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883075.22474: variable 'omit' from source: magic vars 28983 1726883075.22597: variable 'omit' from source: magic vars 28983 1726883075.22686: variable 'omit' from source: magic vars 28983 1726883075.22774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883075.22791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883075.22819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883075.22847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883075.22867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883075.23102: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883075.23107: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883075.23110: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883075.23278: Set connection var ansible_connection to ssh 28983 1726883075.23282: Set connection var ansible_shell_executable to /bin/sh 28983 1726883075.23297: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883075.23318: Set connection var ansible_timeout to 10 28983 1726883075.23331: Set connection var ansible_pipelining to False 28983 1726883075.23341: Set connection var ansible_shell_type to sh 28983 1726883075.23376: variable 'ansible_shell_executable' from source: unknown 28983 1726883075.23387: variable 'ansible_connection' from source: unknown 28983 1726883075.23396: variable 'ansible_module_compression' from source: unknown 28983 1726883075.23404: variable 'ansible_shell_type' from source: unknown 28983 1726883075.23413: variable 'ansible_shell_executable' from source: unknown 28983 1726883075.23429: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883075.23444: variable 'ansible_pipelining' from source: unknown 28983 1726883075.23453: variable 'ansible_timeout' from source: unknown 28983 1726883075.23462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883075.23725: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883075.23760: variable 'omit' from source: magic vars 28983 1726883075.23781: starting attempt loop 28983 1726883075.23792: running the handler 28983 1726883075.23813: _low_level_execute_command(): starting 28983 1726883075.23829: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883075.24759: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.24781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883075.24806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883075.24830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883075.25024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883075.26765: stdout chunk (state=3): >>>/root <<< 28983 1726883075.26975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883075.26979: stdout chunk (state=3): >>><<< 28983 1726883075.26982: stderr chunk (state=3): >>><<< 28983 1726883075.27114: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883075.27118: _low_level_execute_command(): starting 28983 1726883075.27121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676 `" && echo ansible-tmp-1726883075.2701116-32849-89092112904676="` echo /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676 `" ) && sleep 0' 28983 1726883075.27657: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883075.27670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883075.27679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883075.27697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883075.27712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883075.27720: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883075.27730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.27750: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 28983 1726883075.27845: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883075.27850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883075.27955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883075.29992: stdout chunk (state=3): >>>ansible-tmp-1726883075.2701116-32849-89092112904676=/root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676 <<< 28983 1726883075.30124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883075.30175: stderr chunk (state=3): >>><<< 28983 1726883075.30179: stdout chunk (state=3): >>><<< 28983 1726883075.30187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883075.2701116-32849-89092112904676=/root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883075.30231: variable 'ansible_module_compression' from source: unknown 28983 1726883075.30341: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883075.30540: variable 'ansible_facts' from source: unknown 28983 1726883075.30568: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py 28983 1726883075.30846: Sending initial data 28983 1726883075.30850: Sent initial data (161 bytes) 28983 1726883075.31393: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883075.31400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883075.31407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883075.31414: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883075.31427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.31452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883075.31455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.31508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883075.31513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883075.31584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883075.33213: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883075.33223: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883075.33283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883075.33353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpamzez8_f /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py <<< 28983 1726883075.33367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py" <<< 28983 1726883075.33429: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpamzez8_f" to remote "/root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py" <<< 28983 1726883075.35723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883075.35756: stderr chunk (state=3): >>><<< 28983 1726883075.35759: stdout chunk (state=3): >>><<< 28983 1726883075.35778: done transferring module to remote 28983 1726883075.35790: _low_level_execute_command(): starting 28983 1726883075.35795: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/ /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py && sleep 0' 28983 1726883075.36206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883075.36221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883075.36225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.36242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.36305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883075.36312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883075.36384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883075.38361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883075.38365: stdout chunk (state=3): >>><<< 28983 1726883075.38368: stderr chunk (state=3): >>><<< 28983 1726883075.38475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883075.38479: _low_level_execute_command(): starting 28983 1726883075.38484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/AnsiballZ_package_facts.py && sleep 0' 28983 1726883075.39001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883075.39017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883075.39032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883075.39062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883075.39087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.39112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883075.39189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883075.39199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883075.39288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883076.03219: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", 
"release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": 
"0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": 
[{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": 
"3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": 
"libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": 
"0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "sou<<< 28983 1726883076.03451: stdout chunk (state=3): >>>rce": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": 
"perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": 
"0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": 
"1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": 
[{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": 
"1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", 
"version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883076.05263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883076.05267: stdout chunk (state=3): >>><<< 28983 1726883076.05276: stderr chunk (state=3): >>><<< 28983 1726883076.05348: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": 
"fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", 
"release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": 
[{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", 
"release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": 
"5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", 
"version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", 
"version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": 
[{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": 
"1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": 
[{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": 
[{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": 
[{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": 
[{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": 
"0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": 
"500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": 
"0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": 
[{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883076.12186: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883076.12217: _low_level_execute_command(): starting 28983 1726883076.12228: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883075.2701116-32849-89092112904676/ > /dev/null 2>&1 && sleep 0' 28983 1726883076.12868: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883076.12888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883076.12975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883076.13053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883076.13090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883076.13108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883076.13128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883076.13278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883076.15297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883076.15308: stdout chunk (state=3): >>><<< 28983 1726883076.15321: stderr chunk (state=3): >>><<< 28983 1726883076.15345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883076.15539: handler run 
complete 28983 1726883076.17501: variable 'ansible_facts' from source: unknown 28983 1726883076.18517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.23580: variable 'ansible_facts' from source: unknown 28983 1726883076.24524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.26449: attempt loop complete, returning result 28983 1726883076.26663: _execute() done 28983 1726883076.26745: dumping result to json 28983 1726883076.27226: done dumping result, returning 28983 1726883076.27305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000001a1c] 28983 1726883076.27316: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a1c ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883076.34420: no more pending results, returning what we have 28983 1726883076.34424: results queue empty 28983 1726883076.34425: checking for any_errors_fatal 28983 1726883076.34432: done checking for any_errors_fatal 28983 1726883076.34433: checking for max_fail_percentage 28983 1726883076.34496: done checking for max_fail_percentage 28983 1726883076.34498: checking to see if all hosts have failed and the running result is not ok 28983 1726883076.34499: done checking to see if all hosts have failed 28983 1726883076.34500: getting the remaining hosts for this loop 28983 1726883076.34501: done getting the remaining hosts for this loop 28983 1726883076.34506: getting the next task for host managed_node2 28983 1726883076.34516: done getting next task for host managed_node2 28983 1726883076.34521: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883076.34528: ^ state is: HOST STATE: block=7, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883076.34542: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001a1c 28983 1726883076.34545: WORKER PROCESS EXITING 28983 1726883076.34561: getting variables 28983 1726883076.34563: in VariableManager get_vars() 28983 1726883076.34603: Calling all_inventory to load vars for managed_node2 28983 1726883076.34606: Calling groups_inventory to load vars for managed_node2 28983 1726883076.34609: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883076.34721: Calling all_plugins_play to load vars for managed_node2 28983 1726883076.34726: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883076.34730: Calling groups_plugins_play to load vars for managed_node2 28983 1726883076.36985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.40093: done with get_vars() 28983 1726883076.40136: done getting variables 28983 1726883076.40208: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:36 -0400 (0:00:01.197) 0:01:46.400 ****** 28983 1726883076.40262: entering _queue_task() for managed_node2/debug 28983 1726883076.40624: worker is 1 (out of 1 available) 28983 1726883076.40840: exiting _queue_task() for managed_node2/debug 28983 1726883076.40853: done queuing things up, now waiting for results queue to drain 28983 1726883076.40855: waiting for pending results... 
28983 1726883076.40988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883076.41192: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c0 28983 1726883076.41204: variable 'ansible_search_path' from source: unknown 28983 1726883076.41212: variable 'ansible_search_path' from source: unknown 28983 1726883076.41302: calling self._execute() 28983 1726883076.41383: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883076.41397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.41421: variable 'omit' from source: magic vars 28983 1726883076.41893: variable 'ansible_distribution_major_version' from source: facts 28983 1726883076.41912: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883076.41922: variable 'omit' from source: magic vars 28983 1726883076.42020: variable 'omit' from source: magic vars 28983 1726883076.42171: variable 'network_provider' from source: set_fact 28983 1726883076.42189: variable 'omit' from source: magic vars 28983 1726883076.42282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883076.42295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883076.42322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883076.42349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883076.42366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883076.42415: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883076.42499: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883076.42503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.42570: Set connection var ansible_connection to ssh 28983 1726883076.42591: Set connection var ansible_shell_executable to /bin/sh 28983 1726883076.42614: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883076.42629: Set connection var ansible_timeout to 10 28983 1726883076.42643: Set connection var ansible_pipelining to False 28983 1726883076.42651: Set connection var ansible_shell_type to sh 28983 1726883076.42685: variable 'ansible_shell_executable' from source: unknown 28983 1726883076.42693: variable 'ansible_connection' from source: unknown 28983 1726883076.42701: variable 'ansible_module_compression' from source: unknown 28983 1726883076.42713: variable 'ansible_shell_type' from source: unknown 28983 1726883076.42724: variable 'ansible_shell_executable' from source: unknown 28983 1726883076.42731: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883076.42743: variable 'ansible_pipelining' from source: unknown 28983 1726883076.42750: variable 'ansible_timeout' from source: unknown 28983 1726883076.42824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.42945: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883076.42965: variable 'omit' from source: magic vars 28983 1726883076.42978: starting attempt loop 28983 1726883076.42986: running the handler 28983 1726883076.43049: handler run complete 28983 1726883076.43078: attempt loop complete, returning result 28983 1726883076.43088: _execute() done 28983 1726883076.43096: dumping result to json 28983 1726883076.43106: done dumping result, returning 
28983 1726883076.43118: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-0000000019c0] 28983 1726883076.43129: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c0 28983 1726883076.43336: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c0 28983 1726883076.43340: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883076.43425: no more pending results, returning what we have 28983 1726883076.43429: results queue empty 28983 1726883076.43430: checking for any_errors_fatal 28983 1726883076.43448: done checking for any_errors_fatal 28983 1726883076.43450: checking for max_fail_percentage 28983 1726883076.43452: done checking for max_fail_percentage 28983 1726883076.43453: checking to see if all hosts have failed and the running result is not ok 28983 1726883076.43454: done checking to see if all hosts have failed 28983 1726883076.43455: getting the remaining hosts for this loop 28983 1726883076.43457: done getting the remaining hosts for this loop 28983 1726883076.43462: getting the next task for host managed_node2 28983 1726883076.43471: done getting next task for host managed_node2 28983 1726883076.43478: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883076.43485: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883076.43499: getting variables 28983 1726883076.43501: in VariableManager get_vars() 28983 1726883076.43654: Calling all_inventory to load vars for managed_node2 28983 1726883076.43658: Calling groups_inventory to load vars for managed_node2 28983 1726883076.43661: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883076.43674: Calling all_plugins_play to load vars for managed_node2 28983 1726883076.43678: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883076.43683: Calling groups_plugins_play to load vars for managed_node2 28983 1726883076.46168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.49963: done with get_vars() 28983 1726883076.50005: done getting variables 28983 1726883076.50124: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:36 -0400 (0:00:00.099) 0:01:46.499 ****** 28983 1726883076.50181: entering _queue_task() for managed_node2/fail 28983 1726883076.50675: worker is 1 (out of 1 available) 28983 1726883076.50689: exiting _queue_task() for managed_node2/fail 28983 1726883076.50701: done queuing things up, now waiting for results queue to drain 28983 1726883076.50703: waiting for pending results... 28983 1726883076.51052: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883076.51143: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c1 28983 1726883076.51167: variable 'ansible_search_path' from source: unknown 28983 1726883076.51178: variable 'ansible_search_path' from source: unknown 28983 1726883076.51310: calling self._execute() 28983 1726883076.51348: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883076.51361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.51380: variable 'omit' from source: magic vars 28983 1726883076.51839: variable 'ansible_distribution_major_version' from source: facts 28983 1726883076.51866: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883076.52036: variable 'network_state' from source: role '' defaults 28983 1726883076.52055: Evaluated conditional (network_state != {}): False 28983 1726883076.52074: when evaluation is False, skipping this task 28983 1726883076.52084: _execute() done 28983 1726883076.52093: dumping result to json 28983 1726883076.52102: done dumping result, returning 28983 1726883076.52112: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-0000000019c1] 28983 1726883076.52123: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c1 28983 1726883076.52449: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c1 28983 1726883076.52453: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883076.52510: no more pending results, returning what we have 28983 1726883076.52514: results queue empty 28983 1726883076.52515: checking for any_errors_fatal 28983 1726883076.52521: done checking for any_errors_fatal 28983 1726883076.52522: checking for max_fail_percentage 28983 1726883076.52524: done checking for max_fail_percentage 28983 1726883076.52525: checking to see if all hosts have failed and the running result is not ok 28983 1726883076.52526: done checking to see if all hosts have failed 28983 1726883076.52527: getting the remaining hosts for this loop 28983 1726883076.52529: done getting the remaining hosts for this loop 28983 1726883076.52533: getting the next task for host managed_node2 28983 1726883076.52544: done getting next task for host managed_node2 28983 1726883076.52548: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883076.52554: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883076.52584: getting variables 28983 1726883076.52585: in VariableManager get_vars() 28983 1726883076.52627: Calling all_inventory to load vars for managed_node2 28983 1726883076.52630: Calling groups_inventory to load vars for managed_node2 28983 1726883076.52632: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883076.53146: Calling all_plugins_play to load vars for managed_node2 28983 1726883076.53151: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883076.53155: Calling groups_plugins_play to load vars for managed_node2 28983 1726883076.57688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.62926: done with get_vars() 28983 1726883076.63181: done getting variables 28983 1726883076.63465: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:36 -0400 (0:00:00.133) 0:01:46.632 ****** 28983 1726883076.63512: entering _queue_task() for managed_node2/fail 28983 1726883076.64103: worker is 1 (out of 1 available) 28983 1726883076.64120: exiting _queue_task() for managed_node2/fail 28983 1726883076.64338: done queuing things up, now waiting for results queue to drain 28983 1726883076.64341: waiting for pending results... 28983 1726883076.64757: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883076.65020: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c2 28983 1726883076.65086: variable 'ansible_search_path' from source: unknown 28983 1726883076.65240: variable 'ansible_search_path' from source: unknown 28983 1726883076.65244: calling self._execute() 28983 1726883076.65743: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883076.65748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.65750: variable 'omit' from source: magic vars 28983 1726883076.66443: variable 'ansible_distribution_major_version' from source: facts 28983 1726883076.66563: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883076.66972: variable 'network_state' from source: role '' defaults 28983 1726883076.66991: Evaluated conditional (network_state != {}): False 28983 1726883076.67001: when evaluation is False, skipping this task 28983 1726883076.67009: _execute() done 28983 1726883076.67059: dumping result to json 28983 1726883076.67069: done dumping result, returning 28983 1726883076.67082: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-0000000019c2] 28983 1726883076.67095: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c2 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883076.67312: no more pending results, returning what we have 28983 1726883076.67316: results queue empty 28983 1726883076.67317: checking for any_errors_fatal 28983 1726883076.67330: done checking for any_errors_fatal 28983 1726883076.67331: checking for max_fail_percentage 28983 1726883076.67335: done checking for max_fail_percentage 28983 1726883076.67336: checking to see if all hosts have failed and the running result is not ok 28983 1726883076.67337: done checking to see if all hosts have failed 28983 1726883076.67338: getting the remaining hosts for this loop 28983 1726883076.67341: done getting the remaining hosts for this loop 28983 1726883076.67346: getting the next task for host managed_node2 28983 1726883076.67357: done getting next task for host managed_node2 28983 1726883076.67362: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883076.67370: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883076.67408: getting variables 28983 1726883076.67410: in VariableManager get_vars() 28983 1726883076.67664: Calling all_inventory to load vars for managed_node2 28983 1726883076.67668: Calling groups_inventory to load vars for managed_node2 28983 1726883076.67671: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883076.67746: Calling all_plugins_play to load vars for managed_node2 28983 1726883076.67750: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883076.67754: Calling groups_plugins_play to load vars for managed_node2 28983 1726883076.68541: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c2 28983 1726883076.68544: WORKER PROCESS EXITING 28983 1726883076.71589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.74644: done with get_vars() 28983 1726883076.74684: done getting variables 28983 1726883076.74756: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:36 -0400 (0:00:00.112) 0:01:46.745 ****** 28983 1726883076.74799: entering _queue_task() for managed_node2/fail 28983 1726883076.75165: worker is 1 (out of 1 available) 28983 1726883076.75182: exiting _queue_task() for managed_node2/fail 28983 1726883076.75196: done queuing things up, now waiting for results queue to drain 28983 1726883076.75198: waiting for pending results... 28983 1726883076.75520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883076.75708: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c3 28983 1726883076.75731: variable 'ansible_search_path' from source: unknown 28983 1726883076.75742: variable 'ansible_search_path' from source: unknown 28983 1726883076.75794: calling self._execute() 28983 1726883076.75917: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883076.75930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.75948: variable 'omit' from source: magic vars 28983 1726883076.76411: variable 'ansible_distribution_major_version' from source: facts 28983 1726883076.76437: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883076.76679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883076.79478: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883076.79584: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883076.79635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 
1726883076.79691: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883076.79726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883076.79837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883076.79887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883076.79923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883076.79985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883076.80008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883076.80137: variable 'ansible_distribution_major_version' from source: facts 28983 1726883076.80162: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883076.80335: variable 'ansible_distribution' from source: facts 28983 1726883076.80347: variable '__network_rh_distros' from source: role '' defaults 28983 1726883076.80362: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883076.80412: when evaluation is False, skipping this task 28983 1726883076.80415: _execute() done 28983 1726883076.80417: dumping result to json 28983 
1726883076.80420: done dumping result, returning 28983 1726883076.80422: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-0000000019c3] 28983 1726883076.80424: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c3 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883076.80697: no more pending results, returning what we have 28983 1726883076.80701: results queue empty 28983 1726883076.80702: checking for any_errors_fatal 28983 1726883076.80712: done checking for any_errors_fatal 28983 1726883076.80713: checking for max_fail_percentage 28983 1726883076.80716: done checking for max_fail_percentage 28983 1726883076.80717: checking to see if all hosts have failed and the running result is not ok 28983 1726883076.80718: done checking to see if all hosts have failed 28983 1726883076.80719: getting the remaining hosts for this loop 28983 1726883076.80721: done getting the remaining hosts for this loop 28983 1726883076.80727: getting the next task for host managed_node2 28983 1726883076.80739: done getting next task for host managed_node2 28983 1726883076.80745: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883076.80751: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883076.80787: getting variables 28983 1726883076.80789: in VariableManager get_vars() 28983 1726883076.81122: Calling all_inventory to load vars for managed_node2 28983 1726883076.81126: Calling groups_inventory to load vars for managed_node2 28983 1726883076.81130: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883076.81139: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c3 28983 1726883076.81142: WORKER PROCESS EXITING 28983 1726883076.81152: Calling all_plugins_play to load vars for managed_node2 28983 1726883076.81156: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883076.81160: Calling groups_plugins_play to load vars for managed_node2 28983 1726883076.85035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883076.95725: done with get_vars() 28983 1726883076.95769: done getting variables 28983 1726883076.95888: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:36 -0400 (0:00:00.211) 0:01:46.957 ****** 28983 1726883076.95947: entering _queue_task() for managed_node2/dnf 28983 1726883076.96422: worker is 1 (out of 1 available) 28983 1726883076.96556: exiting _queue_task() for managed_node2/dnf 28983 1726883076.96570: done queuing things up, now waiting for results queue to drain 28983 1726883076.96573: waiting for pending results... 28983 1726883076.96819: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883076.97012: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c4 28983 1726883076.97239: variable 'ansible_search_path' from source: unknown 28983 1726883076.97244: variable 'ansible_search_path' from source: unknown 28983 1726883076.97248: calling self._execute() 28983 1726883076.97253: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883076.97256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883076.97259: variable 'omit' from source: magic vars 28983 1726883076.97730: variable 'ansible_distribution_major_version' from source: facts 28983 1726883076.97747: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883076.98089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883077.02094: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883077.02189: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883077.02240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883077.02293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883077.02368: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883077.02604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.02940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.02944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.02947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.02966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.03205: variable 'ansible_distribution' from source: facts 28983 1726883077.03208: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.03218: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883077.03575: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.03886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.03914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.03947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.04003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.04015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.04102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.04129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.04193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.04244: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.04263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.04314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.04344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.04374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.04425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.04450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.04669: variable 'network_connections' from source: include params 28983 1726883077.04685: variable 'interface' from source: play vars 28983 1726883077.04764: variable 'interface' from source: play vars 28983 1726883077.04853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883077.05063: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883077.05207: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883077.05212: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883077.05215: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883077.05244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883077.05270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883077.05305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.05425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883077.05429: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883077.05729: variable 'network_connections' from source: include params 28983 1726883077.05733: variable 'interface' from source: play vars 28983 1726883077.05814: variable 'interface' from source: play vars 28983 1726883077.05841: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883077.05844: when evaluation is False, skipping this task 28983 1726883077.05849: _execute() done 28983 1726883077.05858: dumping result to json 28983 1726883077.05860: done dumping result, returning 28983 1726883077.05870: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000019c4] 28983 1726883077.05873: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c4 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883077.06122: no more pending results, returning what we have 28983 1726883077.06126: results queue empty 28983 1726883077.06127: checking for any_errors_fatal 28983 1726883077.06135: done checking for any_errors_fatal 28983 1726883077.06136: checking for max_fail_percentage 28983 1726883077.06138: done checking for max_fail_percentage 28983 1726883077.06139: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.06140: done checking to see if all hosts have failed 28983 1726883077.06141: getting the remaining hosts for this loop 28983 1726883077.06143: done getting the remaining hosts for this loop 28983 1726883077.06147: getting the next task for host managed_node2 28983 1726883077.06155: done getting next task for host managed_node2 28983 1726883077.06159: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883077.06164: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883077.06188: getting variables 28983 1726883077.06190: in VariableManager get_vars() 28983 1726883077.06231: Calling all_inventory to load vars for managed_node2 28983 1726883077.06297: Calling groups_inventory to load vars for managed_node2 28983 1726883077.06301: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.06313: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.06316: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.06321: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.06842: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c4 28983 1726883077.06845: WORKER PROCESS EXITING 28983 1726883077.08627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.11845: done with get_vars() 28983 1726883077.11887: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883077.11978: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:37 -0400 (0:00:00.160) 0:01:47.117 ****** 28983 1726883077.12018: entering _queue_task() for managed_node2/yum 28983 1726883077.12369: worker is 1 (out of 1 available) 28983 1726883077.12385: exiting _queue_task() for managed_node2/yum 28983 1726883077.12398: done queuing things up, now waiting for results queue to drain 28983 1726883077.12400: waiting for pending results... 28983 1726883077.12781: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883077.12901: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c5 28983 1726883077.12918: variable 'ansible_search_path' from source: unknown 28983 1726883077.12922: variable 'ansible_search_path' from source: unknown 28983 1726883077.12964: calling self._execute() 28983 1726883077.13094: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.13102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.13114: variable 'omit' from source: magic vars 28983 1726883077.13575: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.13591: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.13853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883077.16539: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883077.17442: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883077.17445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883077.17448: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883077.17451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883077.17454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.17456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.17458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.17461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.17463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.17552: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.17566: Evaluated conditional (ansible_distribution_major_version | int < 8): False 28983 1726883077.17569: when evaluation is False, skipping this task 
28983 1726883077.17572: _execute() done 28983 1726883077.17581: dumping result to json 28983 1726883077.17586: done dumping result, returning 28983 1726883077.17594: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000019c5] 28983 1726883077.17603: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c5 28983 1726883077.17722: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c5 28983 1726883077.17725: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883077.17792: no more pending results, returning what we have 28983 1726883077.17796: results queue empty 28983 1726883077.17797: checking for any_errors_fatal 28983 1726883077.17806: done checking for any_errors_fatal 28983 1726883077.17807: checking for max_fail_percentage 28983 1726883077.17810: done checking for max_fail_percentage 28983 1726883077.17811: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.17812: done checking to see if all hosts have failed 28983 1726883077.17813: getting the remaining hosts for this loop 28983 1726883077.17815: done getting the remaining hosts for this loop 28983 1726883077.17820: getting the next task for host managed_node2 28983 1726883077.17831: done getting next task for host managed_node2 28983 1726883077.17838: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883077.17845: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883077.17877: getting variables 28983 1726883077.17879: in VariableManager get_vars() 28983 1726883077.17928: Calling all_inventory to load vars for managed_node2 28983 1726883077.17931: Calling groups_inventory to load vars for managed_node2 28983 1726883077.18043: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.18055: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.18060: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.18063: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.20681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.24309: done with get_vars() 28983 1726883077.24371: done getting variables 28983 1726883077.24443: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:37 -0400 (0:00:00.124) 0:01:47.242 ****** 28983 1726883077.24492: entering _queue_task() for managed_node2/fail 28983 1726883077.25031: worker is 1 (out of 1 available) 28983 1726883077.25048: exiting _queue_task() for managed_node2/fail 28983 1726883077.25061: done queuing things up, now waiting for results queue to drain 28983 1726883077.25063: waiting for pending results... 28983 1726883077.25374: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883077.25549: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c6 28983 1726883077.25553: variable 'ansible_search_path' from source: unknown 28983 1726883077.25556: variable 'ansible_search_path' from source: unknown 28983 1726883077.25561: calling self._execute() 28983 1726883077.25685: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.25694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.25707: variable 'omit' from source: magic vars 28983 1726883077.26178: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.26195: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.26358: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.26641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883077.30143: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883077.30147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883077.30150: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883077.30153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883077.30193: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883077.30284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.30325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.30358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.30543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.30547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.30550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 
1726883077.30552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.30555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.30608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.30630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.30870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.30873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.30876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.30879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.30882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 28983 1726883077.31069: variable 'network_connections' from source: include params 28983 1726883077.31087: variable 'interface' from source: play vars 28983 1726883077.31171: variable 'interface' from source: play vars 28983 1726883077.31263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883077.31539: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883077.31543: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883077.31574: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883077.31615: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883077.31670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883077.31701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883077.31738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.31770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883077.31831: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883077.32187: variable 'network_connections' from source: include params 28983 1726883077.32202: variable 'interface' from source: play 
vars 28983 1726883077.32272: variable 'interface' from source: play vars 28983 1726883077.32302: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883077.32308: when evaluation is False, skipping this task 28983 1726883077.32311: _execute() done 28983 1726883077.32313: dumping result to json 28983 1726883077.32319: done dumping result, returning 28983 1726883077.32328: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000019c6] 28983 1726883077.32335: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c6 28983 1726883077.32605: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c6 28983 1726883077.32609: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883077.32668: no more pending results, returning what we have 28983 1726883077.32674: results queue empty 28983 1726883077.32676: checking for any_errors_fatal 28983 1726883077.32682: done checking for any_errors_fatal 28983 1726883077.32683: checking for max_fail_percentage 28983 1726883077.32685: done checking for max_fail_percentage 28983 1726883077.32686: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.32687: done checking to see if all hosts have failed 28983 1726883077.32688: getting the remaining hosts for this loop 28983 1726883077.32691: done getting the remaining hosts for this loop 28983 1726883077.32696: getting the next task for host managed_node2 28983 1726883077.32704: done getting next task for host managed_node2 28983 1726883077.32709: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883077.32715: ^ state is: 
HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883077.32742: getting variables 28983 1726883077.32744: in VariableManager get_vars() 28983 1726883077.32795: Calling all_inventory to load vars for managed_node2 28983 1726883077.32798: Calling groups_inventory to load vars for managed_node2 28983 1726883077.32801: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.32811: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.32815: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.32820: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.35748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.43028: done with get_vars() 28983 1726883077.43271: done getting variables 28983 1726883077.43538: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:37 -0400 (0:00:00.190) 0:01:47.433 ****** 28983 1726883077.43587: entering _queue_task() for managed_node2/package 28983 1726883077.44455: worker is 1 (out of 1 available) 28983 1726883077.44471: exiting _queue_task() for managed_node2/package 28983 1726883077.44488: done queuing things up, now waiting for results queue to drain 28983 1726883077.44490: waiting for pending results... 
28983 1726883077.45195: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883077.45618: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c7 28983 1726883077.46042: variable 'ansible_search_path' from source: unknown 28983 1726883077.46047: variable 'ansible_search_path' from source: unknown 28983 1726883077.46056: calling self._execute() 28983 1726883077.46059: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.46063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.46065: variable 'omit' from source: magic vars 28983 1726883077.46503: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.46518: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.46779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883077.47097: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883077.47155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883077.47194: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883077.47431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883077.47437: variable 'network_packages' from source: role '' defaults 28983 1726883077.47663: variable '__network_provider_setup' from source: role '' defaults 28983 1726883077.47678: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883077.47976: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883077.47987: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883077.48143: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726883077.48571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883077.53857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883077.54024: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883077.54124: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883077.54232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883077.54266: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883077.54378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.54414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.54752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.54980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.54984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883077.55026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.55055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.55143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.55247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.55267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.55945: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883077.56251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.56275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.56447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.56520: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.56548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.56722: variable 'ansible_python' from source: facts 28983 1726883077.56731: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883077.56854: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883077.56975: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883077.57270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.57273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.57276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.57332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.57364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.57452: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.57516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.57561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.57639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.57662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.57925: variable 'network_connections' from source: include params 28983 1726883077.57929: variable 'interface' from source: play vars 28983 1726883077.58010: variable 'interface' from source: play vars 28983 1726883077.58106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883077.58176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883077.58219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.58285: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883077.58376: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.59118: variable 'network_connections' from source: include params 28983 1726883077.59241: variable 'interface' from source: play vars 28983 1726883077.59294: variable 'interface' from source: play vars 28983 1726883077.59345: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883077.59453: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.59919: variable 'network_connections' from source: include params 28983 1726883077.59930: variable 'interface' from source: play vars 28983 1726883077.60019: variable 'interface' from source: play vars 28983 1726883077.60060: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883077.60191: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883077.60706: variable 'network_connections' from source: include params 28983 1726883077.60741: variable 'interface' from source: play vars 28983 1726883077.60813: variable 'interface' from source: play vars 28983 1726883077.60900: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883077.61040: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883077.61044: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883077.61113: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883077.61476: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883077.62262: variable 'network_connections' from source: include params 28983 1726883077.62341: variable 'interface' from 
source: play vars 28983 1726883077.62361: variable 'interface' from source: play vars 28983 1726883077.62375: variable 'ansible_distribution' from source: facts 28983 1726883077.62385: variable '__network_rh_distros' from source: role '' defaults 28983 1726883077.62396: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.62417: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883077.62707: variable 'ansible_distribution' from source: facts 28983 1726883077.62718: variable '__network_rh_distros' from source: role '' defaults 28983 1726883077.62728: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.62741: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883077.63030: variable 'ansible_distribution' from source: facts 28983 1726883077.63044: variable '__network_rh_distros' from source: role '' defaults 28983 1726883077.63056: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.63236: variable 'network_provider' from source: set_fact 28983 1726883077.63248: variable 'ansible_facts' from source: unknown 28983 1726883077.64592: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883077.64601: when evaluation is False, skipping this task 28983 1726883077.64608: _execute() done 28983 1726883077.64615: dumping result to json 28983 1726883077.64623: done dumping result, returning 28983 1726883077.64638: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-0000000019c7] 28983 1726883077.64651: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c7 28983 1726883077.64902: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c7 28983 1726883077.64906: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883077.65090: no more pending results, returning what we have 28983 1726883077.65094: results queue empty 28983 1726883077.65094: checking for any_errors_fatal 28983 1726883077.65101: done checking for any_errors_fatal 28983 1726883077.65102: checking for max_fail_percentage 28983 1726883077.65104: done checking for max_fail_percentage 28983 1726883077.65105: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.65106: done checking to see if all hosts have failed 28983 1726883077.65107: getting the remaining hosts for this loop 28983 1726883077.65109: done getting the remaining hosts for this loop 28983 1726883077.65113: getting the next task for host managed_node2 28983 1726883077.65120: done getting next task for host managed_node2 28983 1726883077.65124: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883077.65130: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883077.65156: getting variables 28983 1726883077.65157: in VariableManager get_vars() 28983 1726883077.65207: Calling all_inventory to load vars for managed_node2 28983 1726883077.65211: Calling groups_inventory to load vars for managed_node2 28983 1726883077.65213: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.65223: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.65226: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.65244: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.67913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.71882: done with get_vars() 28983 1726883077.71917: done getting variables 28983 1726883077.71974: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:37 -0400 (0:00:00.284) 0:01:47.717 ****** 28983 1726883077.72006: entering _queue_task() for managed_node2/package 28983 1726883077.72287: worker is 1 (out of 1 available) 28983 1726883077.72330: exiting _queue_task() for managed_node2/package 28983 1726883077.72346: done queuing things up, now waiting for results queue to drain 28983 
1726883077.72349: waiting for pending results... 28983 1726883077.72699: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883077.72941: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c8 28983 1726883077.72945: variable 'ansible_search_path' from source: unknown 28983 1726883077.72947: variable 'ansible_search_path' from source: unknown 28983 1726883077.72951: calling self._execute() 28983 1726883077.72998: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.73013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.73030: variable 'omit' from source: magic vars 28983 1726883077.73526: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.73539: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.73713: variable 'network_state' from source: role '' defaults 28983 1726883077.73718: Evaluated conditional (network_state != {}): False 28983 1726883077.73721: when evaluation is False, skipping this task 28983 1726883077.73724: _execute() done 28983 1726883077.73726: dumping result to json 28983 1726883077.73728: done dumping result, returning 28983 1726883077.73767: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000019c8] 28983 1726883077.73771: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c8 28983 1726883077.73887: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c8 28983 1726883077.73890: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883077.73963: no more pending results, returning what we have 28983 1726883077.73967: 
results queue empty 28983 1726883077.73968: checking for any_errors_fatal 28983 1726883077.73975: done checking for any_errors_fatal 28983 1726883077.73976: checking for max_fail_percentage 28983 1726883077.73978: done checking for max_fail_percentage 28983 1726883077.73979: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.73980: done checking to see if all hosts have failed 28983 1726883077.73981: getting the remaining hosts for this loop 28983 1726883077.73982: done getting the remaining hosts for this loop 28983 1726883077.73986: getting the next task for host managed_node2 28983 1726883077.73994: done getting next task for host managed_node2 28983 1726883077.73998: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883077.74005: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883077.74030: getting variables 28983 1726883077.74032: in VariableManager get_vars() 28983 1726883077.74075: Calling all_inventory to load vars for managed_node2 28983 1726883077.74078: Calling groups_inventory to load vars for managed_node2 28983 1726883077.74082: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.74097: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.74101: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.74106: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.76207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.78258: done with get_vars() 28983 1726883077.78282: done getting variables 28983 1726883077.78328: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:37 -0400 (0:00:00.063) 0:01:47.781 ****** 28983 1726883077.78360: entering _queue_task() for managed_node2/package 28983 1726883077.78592: worker is 1 (out of 1 available) 28983 1726883077.78607: exiting _queue_task() for managed_node2/package 28983 1726883077.78620: done queuing things up, now waiting for results queue to drain 28983 1726883077.78622: waiting for pending results... 
28983 1726883077.78828: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883077.79021: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019c9 28983 1726883077.79031: variable 'ansible_search_path' from source: unknown 28983 1726883077.79037: variable 'ansible_search_path' from source: unknown 28983 1726883077.79083: calling self._execute() 28983 1726883077.79345: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.79350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.79353: variable 'omit' from source: magic vars 28983 1726883077.79627: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.79642: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.79801: variable 'network_state' from source: role '' defaults 28983 1726883077.79816: Evaluated conditional (network_state != {}): False 28983 1726883077.79819: when evaluation is False, skipping this task 28983 1726883077.79822: _execute() done 28983 1726883077.79825: dumping result to json 28983 1726883077.79831: done dumping result, returning 28983 1726883077.79842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000019c9] 28983 1726883077.79849: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c9 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883077.80084: no more pending results, returning what we have 28983 1726883077.80087: results queue empty 28983 1726883077.80088: checking for any_errors_fatal 28983 1726883077.80094: done checking for any_errors_fatal 28983 1726883077.80095: checking for max_fail_percentage 28983 
1726883077.80097: done checking for max_fail_percentage 28983 1726883077.80098: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.80098: done checking to see if all hosts have failed 28983 1726883077.80099: getting the remaining hosts for this loop 28983 1726883077.80103: done getting the remaining hosts for this loop 28983 1726883077.80107: getting the next task for host managed_node2 28983 1726883077.80114: done getting next task for host managed_node2 28983 1726883077.80118: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883077.80124: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883077.80148: getting variables 28983 1726883077.80150: in VariableManager get_vars() 28983 1726883077.80187: Calling all_inventory to load vars for managed_node2 28983 1726883077.80190: Calling groups_inventory to load vars for managed_node2 28983 1726883077.80192: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.80202: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.80206: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.80210: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.80751: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019c9 28983 1726883077.80754: WORKER PROCESS EXITING 28983 1726883077.82357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.84341: done with get_vars() 28983 1726883077.84363: done getting variables 28983 1726883077.84410: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:37 -0400 (0:00:00.060) 0:01:47.842 ****** 28983 1726883077.84444: entering _queue_task() for managed_node2/service 28983 1726883077.84665: worker is 1 (out of 1 available) 28983 1726883077.84678: exiting _queue_task() for managed_node2/service 28983 1726883077.84691: done queuing things up, now waiting for results queue to drain 28983 1726883077.84693: waiting for pending results... 
28983 1726883077.85025: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883077.85208: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019ca 28983 1726883077.85211: variable 'ansible_search_path' from source: unknown 28983 1726883077.85215: variable 'ansible_search_path' from source: unknown 28983 1726883077.85218: calling self._execute() 28983 1726883077.85343: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.85358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.85377: variable 'omit' from source: magic vars 28983 1726883077.85907: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.85933: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.86055: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.86223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883077.87968: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883077.88029: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883077.88061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883077.88094: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883077.88124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883077.88188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883077.88214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.88240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.88277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.88290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.88331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.88356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.88377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.88410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.88422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.88464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.88486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.88507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.88539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.88554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.88698: variable 'network_connections' from source: include params 28983 1726883077.88709: variable 'interface' from source: play vars 28983 1726883077.88766: variable 'interface' from source: play vars 28983 1726883077.88826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883077.88960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883077.89005: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883077.89030: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883077.89057: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883077.89093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883077.89116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883077.89138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.89160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883077.89202: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883077.89407: variable 'network_connections' from source: include params 28983 1726883077.89411: variable 'interface' from source: play vars 28983 1726883077.89466: variable 'interface' from source: play vars 28983 1726883077.89487: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883077.89491: when evaluation is False, skipping this task 28983 1726883077.89493: _execute() done 28983 1726883077.89498: dumping result to json 28983 1726883077.89501: done dumping result, returning 28983 1726883077.89509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000019ca] 28983 1726883077.89515: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019ca 28983 1726883077.89616: done sending task result for task 
0affe814-3a2d-b16d-c0a7-0000000019ca 28983 1726883077.89625: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883077.89684: no more pending results, returning what we have 28983 1726883077.89688: results queue empty 28983 1726883077.89689: checking for any_errors_fatal 28983 1726883077.89694: done checking for any_errors_fatal 28983 1726883077.89695: checking for max_fail_percentage 28983 1726883077.89697: done checking for max_fail_percentage 28983 1726883077.89698: checking to see if all hosts have failed and the running result is not ok 28983 1726883077.89699: done checking to see if all hosts have failed 28983 1726883077.89700: getting the remaining hosts for this loop 28983 1726883077.89702: done getting the remaining hosts for this loop 28983 1726883077.89706: getting the next task for host managed_node2 28983 1726883077.89714: done getting next task for host managed_node2 28983 1726883077.89718: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883077.89723: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883077.89753: getting variables 28983 1726883077.89755: in VariableManager get_vars() 28983 1726883077.89797: Calling all_inventory to load vars for managed_node2 28983 1726883077.89800: Calling groups_inventory to load vars for managed_node2 28983 1726883077.89802: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883077.89811: Calling all_plugins_play to load vars for managed_node2 28983 1726883077.89814: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883077.89818: Calling groups_plugins_play to load vars for managed_node2 28983 1726883077.91065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883077.92684: done with get_vars() 28983 1726883077.92706: done getting variables 28983 1726883077.92754: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:37 -0400 (0:00:00.083) 0:01:47.925 ****** 28983 1726883077.92784: entering _queue_task() for managed_node2/service 28983 1726883077.93004: worker is 1 (out of 1 available) 28983 1726883077.93018: exiting _queue_task() for managed_node2/service 28983 1726883077.93031: done 
queuing things up, now waiting for results queue to drain 28983 1726883077.93035: waiting for pending results... 28983 1726883077.93219: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883077.93328: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019cb 28983 1726883077.93342: variable 'ansible_search_path' from source: unknown 28983 1726883077.93346: variable 'ansible_search_path' from source: unknown 28983 1726883077.93383: calling self._execute() 28983 1726883077.93464: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883077.93470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883077.93486: variable 'omit' from source: magic vars 28983 1726883077.93789: variable 'ansible_distribution_major_version' from source: facts 28983 1726883077.93800: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883077.93938: variable 'network_provider' from source: set_fact 28983 1726883077.93945: variable 'network_state' from source: role '' defaults 28983 1726883077.93955: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883077.93961: variable 'omit' from source: magic vars 28983 1726883077.94014: variable 'omit' from source: magic vars 28983 1726883077.94045: variable 'network_service_name' from source: role '' defaults 28983 1726883077.94099: variable 'network_service_name' from source: role '' defaults 28983 1726883077.94188: variable '__network_provider_setup' from source: role '' defaults 28983 1726883077.94195: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883077.94255: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883077.94259: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883077.94310: variable '__network_packages_default_nm' from source: role '' 
defaults 28983 1726883077.94509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883077.96206: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883077.96577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883077.96606: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883077.96637: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883077.96662: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883077.96729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.96757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.96783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.96815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.96828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.96879: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.96896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.96916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.96949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.96963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.97147: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883077.97245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.97266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.97290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.97325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.97339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.97412: variable 'ansible_python' from source: facts 28983 1726883077.97428: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883077.97494: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883077.97563: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883077.97671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.97694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.97714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.97751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.97764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.97805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883077.97828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883077.97855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.97885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883077.97897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883077.98012: variable 'network_connections' from source: include params 28983 1726883077.98019: variable 'interface' from source: play vars 28983 1726883077.98086: variable 'interface' from source: play vars 28983 1726883077.98171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883077.98314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883077.98366: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883077.98406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883077.98441: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883077.98492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883077.98520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883077.98548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883077.98577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883077.98622: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.98846: variable 'network_connections' from source: include params 28983 1726883077.98853: variable 'interface' from source: play vars 28983 1726883077.98914: variable 'interface' from source: play vars 28983 1726883077.98944: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883077.99008: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883077.99248: variable 'network_connections' from source: include params 28983 1726883077.99251: variable 'interface' from source: play vars 28983 1726883077.99314: variable 'interface' from source: play vars 28983 1726883077.99332: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883077.99400: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883077.99638: variable 'network_connections' from source: include params 28983 1726883077.99642: variable 'interface' from source: play vars 28983 1726883077.99702: variable 'interface' from source: play vars 28983 1726883077.99748: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883077.99797: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883077.99805: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883077.99859: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883078.00038: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883078.00439: variable 'network_connections' from source: include params 28983 1726883078.00443: variable 'interface' from source: play vars 28983 1726883078.00499: variable 'interface' from source: play vars 28983 1726883078.00506: variable 'ansible_distribution' from source: facts 28983 1726883078.00511: variable '__network_rh_distros' from source: role '' defaults 28983 1726883078.00518: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.00530: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883078.00679: variable 'ansible_distribution' from source: facts 28983 1726883078.00683: variable '__network_rh_distros' from source: role '' defaults 28983 1726883078.00686: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.00691: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883078.00837: variable 'ansible_distribution' from source: facts 28983 1726883078.00840: variable '__network_rh_distros' from source: role '' defaults 28983 1726883078.00847: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.00876: variable 'network_provider' from source: set_fact 28983 1726883078.00896: variable 'omit' from source: magic vars 28983 1726883078.00920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883078.00946: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883078.00962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883078.00978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883078.00987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883078.01016: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883078.01019: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.01023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.01103: Set connection var ansible_connection to ssh 28983 1726883078.01114: Set connection var ansible_shell_executable to /bin/sh 28983 1726883078.01123: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883078.01131: Set connection var ansible_timeout to 10 28983 1726883078.01140: Set connection var ansible_pipelining to False 28983 1726883078.01144: Set connection var ansible_shell_type to sh 28983 1726883078.01165: variable 'ansible_shell_executable' from source: unknown 28983 1726883078.01168: variable 'ansible_connection' from source: unknown 28983 1726883078.01170: variable 'ansible_module_compression' from source: unknown 28983 1726883078.01175: variable 'ansible_shell_type' from source: unknown 28983 1726883078.01178: variable 'ansible_shell_executable' from source: unknown 28983 1726883078.01182: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.01187: variable 'ansible_pipelining' from source: unknown 28983 1726883078.01189: variable 'ansible_timeout' from source: unknown 28983 1726883078.01195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883078.01284: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883078.01295: variable 'omit' from source: magic vars 28983 1726883078.01301: starting attempt loop 28983 1726883078.01304: running the handler 28983 1726883078.01376: variable 'ansible_facts' from source: unknown 28983 1726883078.01976: _low_level_execute_command(): starting 28983 1726883078.01982: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883078.02517: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.02521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.02524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883078.02526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883078.02528: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.02588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 
1726883078.02591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883078.02598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.02675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.04442: stdout chunk (state=3): >>>/root <<< 28983 1726883078.04553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883078.04602: stderr chunk (state=3): >>><<< 28983 1726883078.04605: stdout chunk (state=3): >>><<< 28983 1726883078.04627: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883078.04641: _low_level_execute_command(): starting 28983 1726883078.04644: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829 `" && echo ansible-tmp-1726883078.046252-32940-6380131670829="` echo /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829 `" ) && sleep 0' 28983 1726883078.05094: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.05098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883078.05100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.05103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.05153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883078.05158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.05231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.07266: stdout chunk (state=3): >>>ansible-tmp-1726883078.046252-32940-6380131670829=/root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829 <<< 28983 1726883078.07391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 
1726883078.07435: stderr chunk (state=3): >>><<< 28983 1726883078.07439: stdout chunk (state=3): >>><<< 28983 1726883078.07455: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883078.046252-32940-6380131670829=/root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883078.07480: variable 'ansible_module_compression' from source: unknown 28983 1726883078.07520: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883078.07578: variable 'ansible_facts' from source: unknown 28983 1726883078.07718: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py 28983 1726883078.07832: Sending initial data 28983 1726883078.07838: Sent initial data (153 bytes) 28983 
1726883078.08290: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.08293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.08299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883078.08302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.08352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883078.08357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.08426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.10079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883078.10085: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883078.10149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883078.10217: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpzam2cq2x /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py <<< 28983 1726883078.10225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py" <<< 28983 1726883078.10286: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpzam2cq2x" to remote "/root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py" <<< 28983 1726883078.12123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883078.12178: stderr chunk (state=3): >>><<< 28983 1726883078.12182: stdout chunk (state=3): >>><<< 28983 1726883078.12198: done transferring module to remote 28983 1726883078.12207: _low_level_execute_command(): starting 28983 1726883078.12212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/ /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py && sleep 0' 28983 1726883078.12619: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.12653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883078.12656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.12658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.12661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.12706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883078.12722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.12794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.14726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883078.14730: stderr chunk (state=3): >>><<< 28983 1726883078.14732: stdout chunk (state=3): >>><<< 28983 1726883078.14748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883078.14751: _low_level_execute_command(): starting 28983 1726883078.14756: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/AnsiballZ_systemd.py && sleep 0' 28983 1726883078.15183: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.15187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.15189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.15191: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.15245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883078.15248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.15320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.48330: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 
; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4534272", "MemoryAvailable": "infinity", "CPUUsageNSec": "1648340000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", 
"FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": 
"infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883078.50331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883078.50349: stdout chunk (state=3): >>><<< 28983 1726883078.50368: stderr chunk (state=3): >>><<< 28983 1726883078.50391: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": 
"success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4534272", "MemoryAvailable": "infinity", "CPUUsageNSec": "1648340000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883078.50711: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883078.50747: _low_level_execute_command(): starting 28983 1726883078.50768: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883078.046252-32940-6380131670829/ > /dev/null 2>&1 && sleep 0' 28983 1726883078.51449: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883078.51464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.51480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.51517: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883078.51632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883078.51644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883078.51687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.51764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.54039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883078.54042: stdout chunk (state=3): >>><<< 28983 1726883078.54045: stderr chunk (state=3): >>><<< 28983 1726883078.54048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883078.54050: handler run complete 28983 1726883078.54056: attempt loop complete, returning result 28983 1726883078.54059: _execute() done 28983 1726883078.54061: dumping result to json 28983 1726883078.54063: done dumping result, returning 28983 1726883078.54065: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-0000000019cb] 28983 1726883078.54067: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883078.54485: no more pending results, returning what we have 28983 1726883078.54541: results queue empty 28983 1726883078.54543: checking for any_errors_fatal 28983 1726883078.54552: done checking for any_errors_fatal 28983 1726883078.54553: checking for max_fail_percentage 28983 1726883078.54555: done checking for max_fail_percentage 28983 1726883078.54556: checking to see if all hosts have failed and the running result is not ok 28983 1726883078.54557: done checking to see if all hosts have failed 28983 1726883078.54558: getting the remaining hosts for this loop 28983 1726883078.54561: done getting the remaining hosts for this loop 28983 1726883078.54566: getting the next task for host managed_node2 
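The large property dump returned for `NetworkManager.service` above (`ActiveState`, `UnitFileState`, `KillSignal`, and so on) is the `status` dictionary the systemd module builds from `systemctl show` output, which prints one `KEY=VALUE` pair per line. As a rough illustration only (this is not the module's actual code, and the real parser handles multi-line values and other edge cases), the conversion looks like this:

```python
def parse_systemd_show(text: str) -> dict:
    """Parse `systemctl show`-style KEY=VALUE lines into a dict.

    Simplified sketch: splits each line on the first '=' and
    ignores lines without one.
    """
    props = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            props[key] = value
    return props

# Sample lines mirroring properties seen in the log above
sample = "ActiveState=active\nUnitFileState=enabled\nKillSignal=15"
print(parse_systemd_show(sample)["ActiveState"])  # active
```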
28983 1726883078.54577: done getting next task for host managed_node2 28983 1726883078.54582: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883078.54588: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883078.54752: getting variables 28983 1726883078.54754: in VariableManager get_vars() 28983 1726883078.54801: Calling all_inventory to load vars for managed_node2 28983 1726883078.54805: Calling groups_inventory to load vars for managed_node2 28983 1726883078.54808: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883078.54820: Calling all_plugins_play to load vars for managed_node2 28983 1726883078.54825: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883078.54830: Calling groups_plugins_play to load vars for managed_node2 28983 1726883078.55439: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cb 28983 1726883078.55444: WORKER PROCESS EXITING 28983 1726883078.57491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883078.60240: done with get_vars() 28983 1726883078.60292: done getting variables 28983 1726883078.60375: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:38 -0400 (0:00:00.676) 0:01:48.602 ****** 28983 1726883078.60426: entering _queue_task() for managed_node2/service 28983 1726883078.60853: worker is 1 (out of 1 available) 28983 1726883078.60869: exiting _queue_task() for managed_node2/service 28983 1726883078.60886: done queuing things up, now waiting for results queue to drain 28983 1726883078.60888: waiting for pending results... 
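The nested `HOST STATE` dumps above (`block=7, task=2, ... tasks child state? (HOST STATE: ...)`) record the strategy's position in the play: indices into the current block's task list, plus a recursive child state for each nested task block. As a purely illustrative model (not Ansible's actual `HostState` class), the printed fields can be sketched like so:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    # Indices into the play's block/task lists, as printed in the log
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1
    fail_state: int = 0
    # The parenthesized "tasks child state?" dump nests recursively
    tasks_child: Optional["HostState"] = None

# Mirrors the outermost two levels of the dump above
state = HostState(block=7, task=2, tasks_child=HostState(block=0, task=7))
print(state.tasks_child.task)  # 7
```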
28983 1726883078.61267: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883078.61470: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019cc 28983 1726883078.61474: variable 'ansible_search_path' from source: unknown 28983 1726883078.61481: variable 'ansible_search_path' from source: unknown 28983 1726883078.61527: calling self._execute() 28983 1726883078.61687: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.61691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.61694: variable 'omit' from source: magic vars 28983 1726883078.62159: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.62179: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883078.62542: variable 'network_provider' from source: set_fact 28983 1726883078.62545: Evaluated conditional (network_provider == "nm"): True 28983 1726883078.62548: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883078.62581: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883078.62812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883078.65565: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883078.65661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883078.65716: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883078.65764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883078.65805: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883078.65927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883078.65984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883078.66032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883078.66101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883078.66142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883078.66186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883078.66206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883078.66227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883078.66263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883078.66280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883078.66315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883078.66345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883078.66368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883078.66402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883078.66414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883078.66539: variable 'network_connections' from source: include params 28983 1726883078.66552: variable 'interface' from source: play vars 28983 1726883078.66615: variable 'interface' from source: play vars 28983 1726883078.66678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883078.66819: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883078.66851: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883078.66878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883078.66905: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883078.66947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883078.66966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883078.66988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883078.67009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883078.67056: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883078.67268: variable 'network_connections' from source: include params 28983 1726883078.67275: variable 'interface' from source: play vars 28983 1726883078.67326: variable 'interface' from source: play vars 28983 1726883078.67353: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883078.67358: when evaluation is False, skipping this task 28983 1726883078.67361: _execute() done 28983 1726883078.67363: dumping result to json 28983 1726883078.67371: done dumping result, returning 28983 1726883078.67377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-0000000019cc] 28983 
1726883078.67389: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cc 28983 1726883078.67488: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cc 28983 1726883078.67491: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883078.67548: no more pending results, returning what we have 28983 1726883078.67552: results queue empty 28983 1726883078.67552: checking for any_errors_fatal 28983 1726883078.67591: done checking for any_errors_fatal 28983 1726883078.67592: checking for max_fail_percentage 28983 1726883078.67595: done checking for max_fail_percentage 28983 1726883078.67596: checking to see if all hosts have failed and the running result is not ok 28983 1726883078.67597: done checking to see if all hosts have failed 28983 1726883078.67597: getting the remaining hosts for this loop 28983 1726883078.67600: done getting the remaining hosts for this loop 28983 1726883078.67606: getting the next task for host managed_node2 28983 1726883078.67614: done getting next task for host managed_node2 28983 1726883078.67619: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883078.67624: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883078.67659: getting variables 28983 1726883078.67661: in VariableManager get_vars() 28983 1726883078.67707: Calling all_inventory to load vars for managed_node2 28983 1726883078.67710: Calling groups_inventory to load vars for managed_node2 28983 1726883078.67712: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883078.67721: Calling all_plugins_play to load vars for managed_node2 28983 1726883078.67724: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883078.67728: Calling groups_plugins_play to load vars for managed_node2 28983 1726883078.69722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883078.71509: done with get_vars() 28983 1726883078.71547: done getting variables 28983 1726883078.71622: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:38 -0400 (0:00:00.112) 0:01:48.714 
****** 28983 1726883078.71663: entering _queue_task() for managed_node2/service 28983 1726883078.72091: worker is 1 (out of 1 available) 28983 1726883078.72105: exiting _queue_task() for managed_node2/service 28983 1726883078.72118: done queuing things up, now waiting for results queue to drain 28983 1726883078.72119: waiting for pending results... 28983 1726883078.72452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883078.72519: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019cd 28983 1726883078.72549: variable 'ansible_search_path' from source: unknown 28983 1726883078.72559: variable 'ansible_search_path' from source: unknown 28983 1726883078.72606: calling self._execute() 28983 1726883078.72727: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.72757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.72769: variable 'omit' from source: magic vars 28983 1726883078.73139: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.73150: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883078.73257: variable 'network_provider' from source: set_fact 28983 1726883078.73263: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883078.73267: when evaluation is False, skipping this task 28983 1726883078.73270: _execute() done 28983 1726883078.73277: dumping result to json 28983 1726883078.73288: done dumping result, returning 28983 1726883078.73292: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-0000000019cd] 28983 1726883078.73299: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cd 28983 1726883078.73402: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cd 28983 1726883078.73405: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883078.73456: no more pending results, returning what we have 28983 1726883078.73460: results queue empty 28983 1726883078.73461: checking for any_errors_fatal 28983 1726883078.73474: done checking for any_errors_fatal 28983 1726883078.73475: checking for max_fail_percentage 28983 1726883078.73477: done checking for max_fail_percentage 28983 1726883078.73478: checking to see if all hosts have failed and the running result is not ok 28983 1726883078.73479: done checking to see if all hosts have failed 28983 1726883078.73480: getting the remaining hosts for this loop 28983 1726883078.73482: done getting the remaining hosts for this loop 28983 1726883078.73486: getting the next task for host managed_node2 28983 1726883078.73494: done getting next task for host managed_node2 28983 1726883078.73500: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883078.73507: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883078.73530: getting variables 28983 1726883078.73532: in VariableManager get_vars() 28983 1726883078.73572: Calling all_inventory to load vars for managed_node2 28983 1726883078.73575: Calling groups_inventory to load vars for managed_node2 28983 1726883078.73578: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883078.73587: Calling all_plugins_play to load vars for managed_node2 28983 1726883078.73590: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883078.73593: Calling groups_plugins_play to load vars for managed_node2 28983 1726883078.75126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883078.77711: done with get_vars() 28983 1726883078.77738: done getting variables 28983 1726883078.77787: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:38 -0400 (0:00:00.061) 0:01:48.775 ****** 28983 1726883078.77816: entering _queue_task() for managed_node2/copy 28983 1726883078.78064: worker is 1 (out of 1 available) 28983 1726883078.78078: exiting _queue_task() for managed_node2/copy 28983 1726883078.78093: done queuing things up, now waiting for results queue to drain 28983 1726883078.78095: waiting for 
pending results... 28983 1726883078.78296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883078.78418: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019ce 28983 1726883078.78432: variable 'ansible_search_path' from source: unknown 28983 1726883078.78437: variable 'ansible_search_path' from source: unknown 28983 1726883078.78472: calling self._execute() 28983 1726883078.78561: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.78567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.78581: variable 'omit' from source: magic vars 28983 1726883078.78913: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.78924: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883078.79033: variable 'network_provider' from source: set_fact 28983 1726883078.79039: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883078.79044: when evaluation is False, skipping this task 28983 1726883078.79047: _execute() done 28983 1726883078.79052: dumping result to json 28983 1726883078.79057: done dumping result, returning 28983 1726883078.79065: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-0000000019ce] 28983 1726883078.79071: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019ce 28983 1726883078.79175: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019ce 28983 1726883078.79178: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883078.79243: no more pending results, returning what we have 28983 1726883078.79247: results queue empty 28983 
1726883078.79248: checking for any_errors_fatal 28983 1726883078.79256: done checking for any_errors_fatal 28983 1726883078.79257: checking for max_fail_percentage 28983 1726883078.79259: done checking for max_fail_percentage 28983 1726883078.79260: checking to see if all hosts have failed and the running result is not ok 28983 1726883078.79261: done checking to see if all hosts have failed 28983 1726883078.79262: getting the remaining hosts for this loop 28983 1726883078.79263: done getting the remaining hosts for this loop 28983 1726883078.79268: getting the next task for host managed_node2 28983 1726883078.79278: done getting next task for host managed_node2 28983 1726883078.79283: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883078.79289: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883078.79313: getting variables 28983 1726883078.79314: in VariableManager get_vars() 28983 1726883078.79355: Calling all_inventory to load vars for managed_node2 28983 1726883078.79358: Calling groups_inventory to load vars for managed_node2 28983 1726883078.79361: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883078.79370: Calling all_plugins_play to load vars for managed_node2 28983 1726883078.79374: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883078.79377: Calling groups_plugins_play to load vars for managed_node2 28983 1726883078.81244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883078.82823: done with get_vars() 28983 1726883078.82848: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:38 -0400 (0:00:00.050) 0:01:48.826 ****** 28983 1726883078.82919: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883078.83154: worker is 1 (out of 1 available) 28983 1726883078.83168: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883078.83183: done queuing things up, now waiting for results queue to drain 28983 1726883078.83186: waiting for pending results... 
28983 1726883078.83379: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883078.83494: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019cf 28983 1726883078.83508: variable 'ansible_search_path' from source: unknown 28983 1726883078.83512: variable 'ansible_search_path' from source: unknown 28983 1726883078.83547: calling self._execute() 28983 1726883078.83633: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.83641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.83652: variable 'omit' from source: magic vars 28983 1726883078.83970: variable 'ansible_distribution_major_version' from source: facts 28983 1726883078.83984: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883078.83991: variable 'omit' from source: magic vars 28983 1726883078.84044: variable 'omit' from source: magic vars 28983 1726883078.84185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883078.85922: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883078.85977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883078.86010: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883078.86044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883078.86067: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883078.86139: variable 'network_provider' from source: set_fact 28983 1726883078.86249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883078.86274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883078.86297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883078.86329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883078.86344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883078.86408: variable 'omit' from source: magic vars 28983 1726883078.86502: variable 'omit' from source: magic vars 28983 1726883078.86591: variable 'network_connections' from source: include params 28983 1726883078.86602: variable 'interface' from source: play vars 28983 1726883078.86655: variable 'interface' from source: play vars 28983 1726883078.86779: variable 'omit' from source: magic vars 28983 1726883078.86787: variable '__lsr_ansible_managed' from source: task vars 28983 1726883078.86842: variable '__lsr_ansible_managed' from source: task vars 28983 1726883078.87002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883078.87188: Loaded config def from plugin (lookup/template) 28983 1726883078.87193: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883078.87217: File lookup term: get_ansible_managed.j2 28983 1726883078.87222: variable 
'ansible_search_path' from source: unknown 28983 1726883078.87225: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883078.87244: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883078.87257: variable 'ansible_search_path' from source: unknown 28983 1726883078.92962: variable 'ansible_managed' from source: unknown 28983 1726883078.93108: variable 'omit' from source: magic vars 28983 1726883078.93132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883078.93157: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883078.93173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883078.93193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883078.93207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883078.93227: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883078.93230: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.93238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.93314: Set connection var ansible_connection to ssh 28983 1726883078.93327: Set connection var ansible_shell_executable to /bin/sh 28983 1726883078.93337: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883078.93345: Set connection var ansible_timeout to 10 28983 1726883078.93352: Set connection var ansible_pipelining to False 28983 1726883078.93354: Set connection var ansible_shell_type to sh 28983 1726883078.93377: variable 'ansible_shell_executable' from source: unknown 28983 1726883078.93380: variable 'ansible_connection' from source: unknown 28983 1726883078.93383: variable 'ansible_module_compression' from source: unknown 28983 1726883078.93387: variable 'ansible_shell_type' from source: unknown 28983 1726883078.93390: variable 'ansible_shell_executable' from source: unknown 28983 1726883078.93395: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883078.93400: variable 'ansible_pipelining' from source: unknown 28983 1726883078.93403: variable 'ansible_timeout' from source: unknown 28983 1726883078.93410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883078.93521: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883078.93534: variable 'omit' from 
source: magic vars 28983 1726883078.93539: starting attempt loop 28983 1726883078.93541: running the handler 28983 1726883078.93555: _low_level_execute_command(): starting 28983 1726883078.93562: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883078.94105: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.94110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.94113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.94167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883078.94170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883078.94174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.94252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.96052: stdout chunk (state=3): >>>/root <<< 28983 1726883078.96165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883078.96214: stderr chunk (state=3): >>><<< 28983 1726883078.96217: stdout chunk 
(state=3): >>><<< 28983 1726883078.96238: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883078.96251: _low_level_execute_command(): starting 28983 1726883078.96254: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396 `" && echo ansible-tmp-1726883078.962384-32969-193277538286396="` echo /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396 `" ) && sleep 0' 28983 1726883078.96695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.96733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883078.96740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.96742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883078.96744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.96793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883078.96800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883078.96870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883078.98911: stdout chunk (state=3): >>>ansible-tmp-1726883078.962384-32969-193277538286396=/root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396 <<< 28983 1726883078.99038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883078.99083: stderr chunk (state=3): >>><<< 28983 1726883078.99086: stdout chunk (state=3): >>><<< 28983 1726883078.99103: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883078.962384-32969-193277538286396=/root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883078.99141: variable 'ansible_module_compression' from source: unknown 28983 1726883078.99178: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883078.99219: variable 'ansible_facts' from source: unknown 28983 1726883078.99318: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py 28983 1726883078.99425: Sending initial data 28983 1726883078.99429: Sent initial data (167 bytes) 28983 1726883078.99889: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883078.99893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.99899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883078.99902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883078.99948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883078.99955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883079.00026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883079.01736: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883079.01801: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883079.01874: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpx4wcf7c5 /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py <<< 28983 1726883079.01877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py" <<< 28983 1726883079.01951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpx4wcf7c5" to remote "/root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py" <<< 28983 1726883079.01954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py" <<< 28983 1726883079.03184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883079.03238: stderr chunk (state=3): >>><<< 28983 1726883079.03245: stdout chunk (state=3): >>><<< 28983 1726883079.03265: done transferring module to remote 28983 1726883079.03276: _low_level_execute_command(): starting 28983 1726883079.03279: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/ /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py && sleep 0' 28983 1726883079.03697: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883079.03737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883079.03741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found <<< 28983 1726883079.03743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883079.03745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883079.03748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883079.03797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883079.03804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883079.03872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883079.05815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883079.05861: stderr chunk (state=3): >>><<< 28983 1726883079.05864: stdout chunk (state=3): >>><<< 28983 1726883079.05880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883079.05884: _low_level_execute_command(): starting 28983 1726883079.05890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/AnsiballZ_network_connections.py && sleep 0' 28983 1726883079.06339: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883079.06370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883079.06375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883079.06378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883079.06450: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883079.06454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883079.06477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883079.06563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883079.35092: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883079.36945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883079.36949: stdout chunk (state=3): >>><<< 28983 1726883079.36956: stderr chunk (state=3): >>><<< 28983 1726883079.37042: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883079.37046: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883079.37049: _low_level_execute_command(): starting 28983 1726883079.37052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883078.962384-32969-193277538286396/ > /dev/null 2>&1 && sleep 0' 28983 1726883079.38318: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883079.38567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883079.38652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883079.38759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883079.40750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883079.40801: stderr chunk (state=3): >>><<< 28983 1726883079.40807: stdout chunk (state=3): >>><<< 28983 1726883079.40928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 28983 1726883079.40932: handler run complete 28983 1726883079.40937: attempt loop complete, returning result 28983 1726883079.40939: _execute() done 28983 1726883079.40941: dumping result to json 28983 1726883079.40944: done dumping result, returning 28983 1726883079.40946: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-0000000019cf] 28983 1726883079.40948: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cf 28983 1726883079.41226: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019cf 28983 1726883079.41229: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d skipped because already active 28983 1726883079.41379: no more pending results, returning what we have 28983 1726883079.41383: results queue empty 28983 1726883079.41384: checking for any_errors_fatal 28983 1726883079.41392: done checking for any_errors_fatal 28983 1726883079.41393: checking for max_fail_percentage 28983 1726883079.41395: done checking for max_fail_percentage 28983 1726883079.41397: checking to see if all hosts have failed and the running result is not ok 28983 1726883079.41398: done checking to see if all hosts have failed 28983 1726883079.41399: getting the remaining hosts for this loop 28983 1726883079.41401: done getting the remaining hosts for this loop 28983 1726883079.41406: getting the next task for host managed_node2 28983 1726883079.41415: done getting next task for host managed_node2 28983 1726883079.41420: ^ task is: TASK: fedora.linux_system_roles.network : 
Configure networking state 28983 1726883079.41426: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883079.41444: getting variables 28983 1726883079.41445: in VariableManager get_vars() 28983 1726883079.41490: Calling all_inventory to load vars for managed_node2 28983 1726883079.41494: Calling groups_inventory to load vars for managed_node2 28983 1726883079.41496: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883079.41506: Calling all_plugins_play to load vars for managed_node2 28983 1726883079.41510: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883079.41513: Calling groups_plugins_play to load vars for managed_node2 28983 1726883079.46496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883079.53041: done with get_vars() 28983 1726883079.53130: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:39 -0400 (0:00:00.704) 0:01:49.531 ****** 28983 1726883079.53417: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883079.54133: worker is 1 (out of 1 available) 28983 1726883079.54589: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883079.54602: done queuing things up, now waiting for results queue to drain 28983 1726883079.54604: waiting for pending results... 
28983 1726883079.55379: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883079.55658: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019d0 28983 1726883079.56143: variable 'ansible_search_path' from source: unknown 28983 1726883079.56147: variable 'ansible_search_path' from source: unknown 28983 1726883079.56151: calling self._execute() 28983 1726883079.56256: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.56277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.56297: variable 'omit' from source: magic vars 28983 1726883079.57279: variable 'ansible_distribution_major_version' from source: facts 28983 1726883079.57302: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883079.57708: variable 'network_state' from source: role '' defaults 28983 1726883079.57728: Evaluated conditional (network_state != {}): False 28983 1726883079.57740: when evaluation is False, skipping this task 28983 1726883079.57749: _execute() done 28983 1726883079.57759: dumping result to json 28983 1726883079.57769: done dumping result, returning 28983 1726883079.57789: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-0000000019d0] 28983 1726883079.57802: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d0 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883079.58010: no more pending results, returning what we have 28983 1726883079.58015: results queue empty 28983 1726883079.58017: checking for any_errors_fatal 28983 1726883079.58032: done checking for any_errors_fatal 28983 1726883079.58036: checking for max_fail_percentage 28983 1726883079.58038: done checking for max_fail_percentage 28983 1726883079.58040: 
checking to see if all hosts have failed and the running result is not ok 28983 1726883079.58041: done checking to see if all hosts have failed 28983 1726883079.58042: getting the remaining hosts for this loop 28983 1726883079.58044: done getting the remaining hosts for this loop 28983 1726883079.58049: getting the next task for host managed_node2 28983 1726883079.58060: done getting next task for host managed_node2 28983 1726883079.58065: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883079.58073: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883079.58108: getting variables 28983 1726883079.58110: in VariableManager get_vars() 28983 1726883079.58266: Calling all_inventory to load vars for managed_node2 28983 1726883079.58270: Calling groups_inventory to load vars for managed_node2 28983 1726883079.58273: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883079.58286: Calling all_plugins_play to load vars for managed_node2 28983 1726883079.58290: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883079.58294: Calling groups_plugins_play to load vars for managed_node2 28983 1726883079.58951: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d0 28983 1726883079.58955: WORKER PROCESS EXITING 28983 1726883079.60788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883079.63755: done with get_vars() 28983 1726883079.63813: done getting variables 28983 1726883079.63896: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:39 -0400 (0:00:00.105) 0:01:49.637 ****** 28983 1726883079.63947: entering _queue_task() for managed_node2/debug 28983 1726883079.64347: worker is 1 (out of 1 available) 28983 1726883079.64363: exiting _queue_task() for managed_node2/debug 28983 1726883079.64377: done queuing things up, now waiting for results queue to drain 28983 1726883079.64379: waiting for pending results... 
28983 1726883079.64700: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883079.64890: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019d1 28983 1726883079.64914: variable 'ansible_search_path' from source: unknown 28983 1726883079.64922: variable 'ansible_search_path' from source: unknown 28983 1726883079.64973: calling self._execute() 28983 1726883079.65094: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.65109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.65127: variable 'omit' from source: magic vars 28983 1726883079.65578: variable 'ansible_distribution_major_version' from source: facts 28983 1726883079.65598: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883079.65610: variable 'omit' from source: magic vars 28983 1726883079.65702: variable 'omit' from source: magic vars 28983 1726883079.65754: variable 'omit' from source: magic vars 28983 1726883079.65805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883079.65855: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883079.66039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883079.66043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883079.66045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883079.66048: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883079.66050: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.66052: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883079.66095: Set connection var ansible_connection to ssh 28983 1726883079.66113: Set connection var ansible_shell_executable to /bin/sh 28983 1726883079.66128: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883079.66145: Set connection var ansible_timeout to 10 28983 1726883079.66158: Set connection var ansible_pipelining to False 28983 1726883079.66169: Set connection var ansible_shell_type to sh 28983 1726883079.66197: variable 'ansible_shell_executable' from source: unknown 28983 1726883079.66205: variable 'ansible_connection' from source: unknown 28983 1726883079.66211: variable 'ansible_module_compression' from source: unknown 28983 1726883079.66217: variable 'ansible_shell_type' from source: unknown 28983 1726883079.66223: variable 'ansible_shell_executable' from source: unknown 28983 1726883079.66228: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.66238: variable 'ansible_pipelining' from source: unknown 28983 1726883079.66244: variable 'ansible_timeout' from source: unknown 28983 1726883079.66251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.66425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883079.66448: variable 'omit' from source: magic vars 28983 1726883079.66458: starting attempt loop 28983 1726883079.66465: running the handler 28983 1726883079.66623: variable '__network_connections_result' from source: set_fact 28983 1726883079.66684: handler run complete 28983 1726883079.66716: attempt loop complete, returning result 28983 1726883079.66820: _execute() done 28983 1726883079.66824: dumping result to json 28983 1726883079.66826: 
done dumping result, returning 28983 1726883079.66829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000019d1] 28983 1726883079.66831: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d1 28983 1726883079.66904: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d1 28983 1726883079.66907: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d skipped because already active" ] } 28983 1726883079.67007: no more pending results, returning what we have 28983 1726883079.67011: results queue empty 28983 1726883079.67012: checking for any_errors_fatal 28983 1726883079.67022: done checking for any_errors_fatal 28983 1726883079.67023: checking for max_fail_percentage 28983 1726883079.67025: done checking for max_fail_percentage 28983 1726883079.67026: checking to see if all hosts have failed and the running result is not ok 28983 1726883079.67027: done checking to see if all hosts have failed 28983 1726883079.67028: getting the remaining hosts for this loop 28983 1726883079.67030: done getting the remaining hosts for this loop 28983 1726883079.67037: getting the next task for host managed_node2 28983 1726883079.67046: done getting next task for host managed_node2 28983 1726883079.67051: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883079.67058: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883079.67075: getting variables 28983 1726883079.67077: in VariableManager get_vars() 28983 1726883079.67127: Calling all_inventory to load vars for managed_node2 28983 1726883079.67130: Calling groups_inventory to load vars for managed_node2 28983 1726883079.67133: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883079.67349: Calling all_plugins_play to load vars for managed_node2 28983 1726883079.67353: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883079.67356: Calling groups_plugins_play to load vars for managed_node2 28983 1726883079.69915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883079.72904: done with get_vars() 28983 1726883079.72944: done getting variables 28983 1726883079.73015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:39 -0400 (0:00:00.091) 0:01:49.728 ****** 28983 1726883079.73065: entering _queue_task() for managed_node2/debug 28983 1726883079.73667: worker is 1 (out of 1 available) 28983 1726883079.73679: exiting _queue_task() for managed_node2/debug 28983 1726883079.73691: done queuing things up, now waiting for results queue to drain 28983 1726883079.73692: waiting for pending results... 28983 1726883079.73787: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883079.74028: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019d2 28983 1726883079.74031: variable 'ansible_search_path' from source: unknown 28983 1726883079.74036: variable 'ansible_search_path' from source: unknown 28983 1726883079.74138: calling self._execute() 28983 1726883079.74192: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.74206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.74222: variable 'omit' from source: magic vars 28983 1726883079.74674: variable 'ansible_distribution_major_version' from source: facts 28983 1726883079.74696: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883079.74707: variable 'omit' from source: magic vars 28983 1726883079.74794: variable 'omit' from source: magic vars 28983 1726883079.74842: variable 'omit' from source: magic vars 28983 1726883079.74897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883079.75140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883079.75143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883079.75146: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883079.75149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883079.75152: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883079.75154: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.75156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.75179: Set connection var ansible_connection to ssh 28983 1726883079.75196: Set connection var ansible_shell_executable to /bin/sh 28983 1726883079.75212: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883079.75229: Set connection var ansible_timeout to 10 28983 1726883079.75243: Set connection var ansible_pipelining to False 28983 1726883079.75250: Set connection var ansible_shell_type to sh 28983 1726883079.75285: variable 'ansible_shell_executable' from source: unknown 28983 1726883079.75293: variable 'ansible_connection' from source: unknown 28983 1726883079.75301: variable 'ansible_module_compression' from source: unknown 28983 1726883079.75308: variable 'ansible_shell_type' from source: unknown 28983 1726883079.75315: variable 'ansible_shell_executable' from source: unknown 28983 1726883079.75322: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.75330: variable 'ansible_pipelining' from source: unknown 28983 1726883079.75339: variable 'ansible_timeout' from source: unknown 28983 1726883079.75348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.75519: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883079.75541: variable 'omit' from source: magic vars 28983 1726883079.75552: starting attempt loop 28983 1726883079.75560: running the handler 28983 1726883079.75624: variable '__network_connections_result' from source: set_fact 28983 1726883079.75727: variable '__network_connections_result' from source: set_fact 28983 1726883079.75871: handler run complete 28983 1726883079.75910: attempt loop complete, returning result 28983 1726883079.75922: _execute() done 28983 1726883079.75930: dumping result to json 28983 1726883079.75942: done dumping result, returning 28983 1726883079.75955: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000019d2] 28983 1726883079.75965: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d2 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 2ca3cca4-edb7-40a1-9de5-195b63d4908d skipped because already active" ] } } 28983 1726883079.76307: no more pending results, returning what we have 28983 1726883079.76311: results queue empty 28983 1726883079.76312: checking for any_errors_fatal 28983 1726883079.76321: done checking for any_errors_fatal 28983 1726883079.76322: checking for 
max_fail_percentage 28983 1726883079.76324: done checking for max_fail_percentage 28983 1726883079.76325: checking to see if all hosts have failed and the running result is not ok 28983 1726883079.76326: done checking to see if all hosts have failed 28983 1726883079.76327: getting the remaining hosts for this loop 28983 1726883079.76330: done getting the remaining hosts for this loop 28983 1726883079.76337: getting the next task for host managed_node2 28983 1726883079.76347: done getting next task for host managed_node2 28983 1726883079.76351: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883079.76357: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883079.76373: getting variables 28983 1726883079.76375: in VariableManager get_vars() 28983 1726883079.76425: Calling all_inventory to load vars for managed_node2 28983 1726883079.76428: Calling groups_inventory to load vars for managed_node2 28983 1726883079.76431: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883079.76439: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d2 28983 1726883079.76449: WORKER PROCESS EXITING 28983 1726883079.76644: Calling all_plugins_play to load vars for managed_node2 28983 1726883079.76648: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883079.76653: Calling groups_plugins_play to load vars for managed_node2 28983 1726883079.81151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883079.85352: done with get_vars() 28983 1726883079.85390: done getting variables 28983 1726883079.85462: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:39 -0400 (0:00:00.124) 0:01:49.852 ****** 28983 1726883079.85504: entering _queue_task() for managed_node2/debug 28983 1726883079.85873: worker is 1 (out of 1 available) 28983 1726883079.85887: exiting _queue_task() for managed_node2/debug 28983 1726883079.85901: done queuing things up, now waiting for results queue to drain 28983 1726883079.85903: waiting for pending results... 
28983 1726883079.86212: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883079.86409: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019d3 28983 1726883079.86433: variable 'ansible_search_path' from source: unknown 28983 1726883079.86444: variable 'ansible_search_path' from source: unknown 28983 1726883079.86494: calling self._execute() 28983 1726883079.86613: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.86627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.86646: variable 'omit' from source: magic vars 28983 1726883079.87091: variable 'ansible_distribution_major_version' from source: facts 28983 1726883079.87111: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883079.87281: variable 'network_state' from source: role '' defaults 28983 1726883079.87300: Evaluated conditional (network_state != {}): False 28983 1726883079.87540: when evaluation is False, skipping this task 28983 1726883079.87543: _execute() done 28983 1726883079.87546: dumping result to json 28983 1726883079.87548: done dumping result, returning 28983 1726883079.87551: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-0000000019d3] 28983 1726883079.87553: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d3 28983 1726883079.87621: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d3 28983 1726883079.87624: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883079.87678: no more pending results, returning what we have 28983 1726883079.87683: results queue empty 28983 1726883079.87684: checking for any_errors_fatal 28983 1726883079.87693: done checking for any_errors_fatal 28983 1726883079.87694: checking for 
max_fail_percentage 28983 1726883079.87696: done checking for max_fail_percentage 28983 1726883079.87697: checking to see if all hosts have failed and the running result is not ok 28983 1726883079.87698: done checking to see if all hosts have failed 28983 1726883079.87699: getting the remaining hosts for this loop 28983 1726883079.87701: done getting the remaining hosts for this loop 28983 1726883079.87705: getting the next task for host managed_node2 28983 1726883079.87714: done getting next task for host managed_node2 28983 1726883079.87719: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883079.87725: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883079.87754: getting variables 28983 1726883079.87756: in VariableManager get_vars() 28983 1726883079.87801: Calling all_inventory to load vars for managed_node2 28983 1726883079.87804: Calling groups_inventory to load vars for managed_node2 28983 1726883079.87807: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883079.87818: Calling all_plugins_play to load vars for managed_node2 28983 1726883079.87822: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883079.87826: Calling groups_plugins_play to load vars for managed_node2 28983 1726883079.90114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883079.93092: done with get_vars() 28983 1726883079.93127: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:39 -0400 (0:00:00.077) 0:01:49.930 ****** 28983 1726883079.93241: entering _queue_task() for managed_node2/ping 28983 1726883079.93552: worker is 1 (out of 1 available) 28983 1726883079.93564: exiting _queue_task() for managed_node2/ping 28983 1726883079.93578: done queuing things up, now waiting for results queue to drain 28983 1726883079.93580: waiting for pending results... 
28983 1726883079.93889: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883079.94087: in run() - task 0affe814-3a2d-b16d-c0a7-0000000019d4 28983 1726883079.94109: variable 'ansible_search_path' from source: unknown 28983 1726883079.94118: variable 'ansible_search_path' from source: unknown 28983 1726883079.94241: calling self._execute() 28983 1726883079.94289: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.94300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.94314: variable 'omit' from source: magic vars 28983 1726883079.94745: variable 'ansible_distribution_major_version' from source: facts 28983 1726883079.94765: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883079.94778: variable 'omit' from source: magic vars 28983 1726883079.94865: variable 'omit' from source: magic vars 28983 1726883079.94911: variable 'omit' from source: magic vars 28983 1726883079.95297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883079.95301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883079.95304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883079.95306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883079.95308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883079.95344: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883079.95354: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.95363: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883079.95591: Set connection var ansible_connection to ssh 28983 1726883079.95841: Set connection var ansible_shell_executable to /bin/sh 28983 1726883079.95844: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883079.95847: Set connection var ansible_timeout to 10 28983 1726883079.95849: Set connection var ansible_pipelining to False 28983 1726883079.95851: Set connection var ansible_shell_type to sh 28983 1726883079.95853: variable 'ansible_shell_executable' from source: unknown 28983 1726883079.95855: variable 'ansible_connection' from source: unknown 28983 1726883079.95858: variable 'ansible_module_compression' from source: unknown 28983 1726883079.95860: variable 'ansible_shell_type' from source: unknown 28983 1726883079.95862: variable 'ansible_shell_executable' from source: unknown 28983 1726883079.95864: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883079.95866: variable 'ansible_pipelining' from source: unknown 28983 1726883079.95868: variable 'ansible_timeout' from source: unknown 28983 1726883079.95871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883079.96371: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883079.96554: variable 'omit' from source: magic vars 28983 1726883079.96732: starting attempt loop 28983 1726883079.96738: running the handler 28983 1726883079.96740: _low_level_execute_command(): starting 28983 1726883079.96743: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883079.98201: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883079.98231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883079.98351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883079.98450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883080.00223: stdout chunk (state=3): >>>/root <<< 28983 1726883080.00458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883080.00525: stderr chunk (state=3): >>><<< 28983 1726883080.00583: stdout chunk (state=3): >>><<< 28983 1726883080.00611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883080.00631: _low_level_execute_command(): starting 28983 1726883080.00694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694 `" && echo ansible-tmp-1726883080.0061843-32995-101913915353694="` echo /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694 `" ) && sleep 0' 28983 1726883080.01888: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883080.01892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883080.01895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883080.01898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883080.01907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883080.02031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883080.02163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883080.02170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883080.02247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883080.04713: stdout chunk (state=3): >>>ansible-tmp-1726883080.0061843-32995-101913915353694=/root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694 <<< 28983 1726883080.04717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883080.04719: stdout chunk (state=3): >>><<< 28983 1726883080.04722: stderr chunk (state=3): >>><<< 28983 1726883080.04724: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883080.0061843-32995-101913915353694=/root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883080.04727: variable 'ansible_module_compression' from source: unknown 28983 1726883080.04729: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883080.04731: variable 'ansible_facts' from source: unknown 28983 1726883080.04784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py 28983 1726883080.05168: Sending initial data 28983 1726883080.05184: Sent initial data (153 bytes) 28983 1726883080.06680: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883080.06819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 
1726883080.06901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883080.06917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883080.07263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883080.08972: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883080.09065: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883080.09126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpja9tm2vt /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py <<< 28983 1726883080.09151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py" <<< 28983 1726883080.09220: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpja9tm2vt" to remote "/root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py" <<< 28983 1726883080.10410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883080.10440: stderr chunk (state=3): >>><<< 28983 1726883080.10458: stdout chunk (state=3): >>><<< 28983 1726883080.10573: done transferring module to remote 28983 1726883080.10577: _low_level_execute_command(): starting 28983 1726883080.10579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/ /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py && sleep 0' 28983 1726883080.11359: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883080.11484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883080.11506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883080.11524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883080.11650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883080.13610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883080.13633: stdout chunk (state=3): >>><<< 28983 1726883080.13648: stderr chunk (state=3): >>><<< 28983 1726883080.13673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883080.13684: _low_level_execute_command(): starting 28983 1726883080.13695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/AnsiballZ_ping.py && sleep 0' 28983 1726883080.14347: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883080.14677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883080.14681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883080.14706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883080.14725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883080.14785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883080.15370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883080.32436: stdout chunk (state=3): 
>>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883080.33890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883080.33940: stderr chunk (state=3): >>><<< 28983 1726883080.33944: stdout chunk (state=3): >>><<< 28983 1726883080.33961: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883080.33986: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883080.34000: _low_level_execute_command(): starting 28983 1726883080.34007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883080.0061843-32995-101913915353694/ > /dev/null 2>&1 && sleep 0' 28983 1726883080.34418: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883080.34458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883080.34462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883080.34464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883080.34467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883080.34514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883080.34518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883080.34593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883080.36556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883080.36678: stderr chunk (state=3): >>><<< 28983 1726883080.36681: stdout chunk (state=3): >>><<< 28983 1726883080.36684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883080.36691: handler run complete 28983 
1726883080.36694: attempt loop complete, returning result 28983 1726883080.36696: _execute() done 28983 1726883080.36698: dumping result to json 28983 1726883080.36700: done dumping result, returning 28983 1726883080.36702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-0000000019d4] 28983 1726883080.36705: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d4 28983 1726883080.36928: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000019d4 28983 1726883080.36934: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883080.37041: no more pending results, returning what we have 28983 1726883080.37045: results queue empty 28983 1726883080.37046: checking for any_errors_fatal 28983 1726883080.37055: done checking for any_errors_fatal 28983 1726883080.37055: checking for max_fail_percentage 28983 1726883080.37058: done checking for max_fail_percentage 28983 1726883080.37059: checking to see if all hosts have failed and the running result is not ok 28983 1726883080.37060: done checking to see if all hosts have failed 28983 1726883080.37060: getting the remaining hosts for this loop 28983 1726883080.37062: done getting the remaining hosts for this loop 28983 1726883080.37067: getting the next task for host managed_node2 28983 1726883080.37081: done getting next task for host managed_node2 28983 1726883080.37084: ^ task is: TASK: meta (role_complete) 28983 1726883080.37090: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883080.37105: getting variables 28983 1726883080.37106: in VariableManager get_vars() 28983 1726883080.37165: Calling all_inventory to load vars for managed_node2 28983 1726883080.37168: Calling groups_inventory to load vars for managed_node2 28983 1726883080.37173: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.37183: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.37187: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.37190: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.43703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883080.46066: done with get_vars() 28983 1726883080.46104: done getting variables 28983 1726883080.46191: done queuing things up, now waiting for results queue to drain 28983 1726883080.46193: results queue empty 28983 1726883080.46194: checking for any_errors_fatal 28983 1726883080.46198: done checking for any_errors_fatal 28983 1726883080.46199: checking for max_fail_percentage 28983 1726883080.46200: done checking for max_fail_percentage 28983 1726883080.46201: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883080.46202: done checking to see if all hosts have failed 28983 1726883080.46203: getting the remaining hosts for this loop 28983 1726883080.46204: done getting the remaining hosts for this loop 28983 1726883080.46207: getting the next task for host managed_node2 28983 1726883080.46215: done getting next task for host managed_node2 28983 1726883080.46217: ^ task is: TASK: Include network role 28983 1726883080.46220: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883080.46223: getting variables 28983 1726883080.46225: in VariableManager get_vars() 28983 1726883080.46240: Calling all_inventory to load vars for managed_node2 28983 1726883080.46243: Calling groups_inventory to load vars for managed_node2 28983 1726883080.46246: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.46252: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.46256: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.46261: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.48263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883080.51400: done with get_vars() 28983 1726883080.51433: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 21:44:40 -0400 (0:00:00.582) 0:01:50.513 ****** 28983 1726883080.51528: entering _queue_task() for managed_node2/include_role 28983 1726883080.51918: worker is 1 (out of 1 available) 28983 1726883080.51932: exiting _queue_task() for managed_node2/include_role 28983 1726883080.51947: done queuing things up, now waiting for results queue to drain 28983 1726883080.51950: waiting for pending results... 
28983 1726883080.52358: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883080.52475: in run() - task 0affe814-3a2d-b16d-c0a7-0000000017d9 28983 1726883080.52541: variable 'ansible_search_path' from source: unknown 28983 1726883080.52545: variable 'ansible_search_path' from source: unknown 28983 1726883080.52558: calling self._execute() 28983 1726883080.52740: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883080.52744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883080.52748: variable 'omit' from source: magic vars 28983 1726883080.53195: variable 'ansible_distribution_major_version' from source: facts 28983 1726883080.53217: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883080.53228: _execute() done 28983 1726883080.53239: dumping result to json 28983 1726883080.53248: done dumping result, returning 28983 1726883080.53259: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-0000000017d9] 28983 1726883080.53269: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d9 28983 1726883080.53453: no more pending results, returning what we have 28983 1726883080.53460: in VariableManager get_vars() 28983 1726883080.53516: Calling all_inventory to load vars for managed_node2 28983 1726883080.53520: Calling groups_inventory to load vars for managed_node2 28983 1726883080.53524: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.53541: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.53546: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.53550: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.54374: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000017d9 28983 1726883080.54377: WORKER PROCESS EXITING 28983 1726883080.56097: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883080.59290: done with get_vars() 28983 1726883080.59325: variable 'ansible_search_path' from source: unknown 28983 1726883080.59326: variable 'ansible_search_path' from source: unknown 28983 1726883080.59533: variable 'omit' from source: magic vars 28983 1726883080.59595: variable 'omit' from source: magic vars 28983 1726883080.59616: variable 'omit' from source: magic vars 28983 1726883080.59620: we have included files to process 28983 1726883080.59621: generating all_blocks data 28983 1726883080.59624: done generating all_blocks data 28983 1726883080.59629: processing included file: fedora.linux_system_roles.network 28983 1726883080.59658: in VariableManager get_vars() 28983 1726883080.59679: done with get_vars() 28983 1726883080.59714: in VariableManager get_vars() 28983 1726883080.59737: done with get_vars() 28983 1726883080.59789: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883080.59964: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883080.60083: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883080.60747: in VariableManager get_vars() 28983 1726883080.60781: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883080.65868: iterating over new_blocks loaded from include file 28983 1726883080.65870: in VariableManager get_vars() 28983 1726883080.65895: done with get_vars() 28983 1726883080.65898: filtering new block on tags 28983 1726883080.66407: done filtering new block on tags 28983 1726883080.66411: in VariableManager get_vars() 28983 1726883080.66431: done with get_vars() 28983 1726883080.66433: filtering new block on tags 28983 1726883080.66456: done 
filtering new block on tags 28983 1726883080.66459: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883080.66470: extending task lists for all hosts with included blocks 28983 1726883080.66635: done extending task lists 28983 1726883080.66637: done processing included files 28983 1726883080.66638: results queue empty 28983 1726883080.66639: checking for any_errors_fatal 28983 1726883080.66641: done checking for any_errors_fatal 28983 1726883080.66642: checking for max_fail_percentage 28983 1726883080.66644: done checking for max_fail_percentage 28983 1726883080.66644: checking to see if all hosts have failed and the running result is not ok 28983 1726883080.66646: done checking to see if all hosts have failed 28983 1726883080.66646: getting the remaining hosts for this loop 28983 1726883080.66648: done getting the remaining hosts for this loop 28983 1726883080.66652: getting the next task for host managed_node2 28983 1726883080.66657: done getting next task for host managed_node2 28983 1726883080.66660: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883080.66664: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883080.66680: getting variables 28983 1726883080.66687: in VariableManager get_vars() 28983 1726883080.66704: Calling all_inventory to load vars for managed_node2 28983 1726883080.66707: Calling groups_inventory to load vars for managed_node2 28983 1726883080.66710: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.66716: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.66719: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.66723: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.68849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883080.72014: done with get_vars() 28983 1726883080.72050: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:40 -0400 (0:00:00.206) 0:01:50.719 ****** 28983 1726883080.72147: entering _queue_task() for managed_node2/include_tasks 28983 1726883080.72529: worker is 1 (out of 1 available) 28983 1726883080.72543: exiting _queue_task() for managed_node2/include_tasks 28983 1726883080.72557: done queuing things up, now waiting for results queue to drain 28983 1726883080.72559: waiting for pending results... 
28983 1726883080.72953: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883080.73091: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b3b 28983 1726883080.73117: variable 'ansible_search_path' from source: unknown 28983 1726883080.73126: variable 'ansible_search_path' from source: unknown 28983 1726883080.73178: calling self._execute() 28983 1726883080.73306: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883080.73399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883080.73405: variable 'omit' from source: magic vars 28983 1726883080.73834: variable 'ansible_distribution_major_version' from source: facts 28983 1726883080.73857: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883080.73868: _execute() done 28983 1726883080.73880: dumping result to json 28983 1726883080.73889: done dumping result, returning 28983 1726883080.73902: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-000000001b3b] 28983 1726883080.73913: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3b 28983 1726883080.74120: no more pending results, returning what we have 28983 1726883080.74126: in VariableManager get_vars() 28983 1726883080.74188: Calling all_inventory to load vars for managed_node2 28983 1726883080.74191: Calling groups_inventory to load vars for managed_node2 28983 1726883080.74194: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.74207: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.74211: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.74215: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.74851: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3b 28983 
1726883080.74854: WORKER PROCESS EXITING 28983 1726883080.77020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883080.80286: done with get_vars() 28983 1726883080.80323: variable 'ansible_search_path' from source: unknown 28983 1726883080.80325: variable 'ansible_search_path' from source: unknown 28983 1726883080.80374: we have included files to process 28983 1726883080.80376: generating all_blocks data 28983 1726883080.80378: done generating all_blocks data 28983 1726883080.80382: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883080.80384: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883080.80386: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883080.81155: done processing included file 28983 1726883080.81158: iterating over new_blocks loaded from include file 28983 1726883080.81160: in VariableManager get_vars() 28983 1726883080.81197: done with get_vars() 28983 1726883080.81200: filtering new block on tags 28983 1726883080.81242: done filtering new block on tags 28983 1726883080.81246: in VariableManager get_vars() 28983 1726883080.81277: done with get_vars() 28983 1726883080.81279: filtering new block on tags 28983 1726883080.81346: done filtering new block on tags 28983 1726883080.81350: in VariableManager get_vars() 28983 1726883080.81383: done with get_vars() 28983 1726883080.81385: filtering new block on tags 28983 1726883080.81449: done filtering new block on tags 28983 1726883080.81451: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883080.81458: extending task lists for all hosts 
with included blocks 28983 1726883080.83985: done extending task lists 28983 1726883080.83987: done processing included files 28983 1726883080.83988: results queue empty 28983 1726883080.83989: checking for any_errors_fatal 28983 1726883080.83994: done checking for any_errors_fatal 28983 1726883080.83995: checking for max_fail_percentage 28983 1726883080.83997: done checking for max_fail_percentage 28983 1726883080.83998: checking to see if all hosts have failed and the running result is not ok 28983 1726883080.83999: done checking to see if all hosts have failed 28983 1726883080.84000: getting the remaining hosts for this loop 28983 1726883080.84002: done getting the remaining hosts for this loop 28983 1726883080.84006: getting the next task for host managed_node2 28983 1726883080.84012: done getting next task for host managed_node2 28983 1726883080.84015: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883080.84021: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883080.84038: getting variables 28983 1726883080.84040: in VariableManager get_vars() 28983 1726883080.84064: Calling all_inventory to load vars for managed_node2 28983 1726883080.84067: Calling groups_inventory to load vars for managed_node2 28983 1726883080.84070: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.84079: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.84083: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.84087: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.86294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883080.89458: done with get_vars() 28983 1726883080.89497: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:44:40 -0400 (0:00:00.174) 0:01:50.893 ****** 28983 1726883080.89603: entering _queue_task() for managed_node2/setup 28983 1726883080.90022: worker is 1 (out of 1 available) 28983 1726883080.90040: exiting _queue_task() for managed_node2/setup 28983 1726883080.90055: done queuing things up, now waiting for results queue to drain 28983 1726883080.90058: waiting for pending results... 
28983 1726883080.90352: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883080.90539: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b92 28983 1726883080.90557: variable 'ansible_search_path' from source: unknown 28983 1726883080.90561: variable 'ansible_search_path' from source: unknown 28983 1726883080.90600: calling self._execute() 28983 1726883080.90708: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883080.90717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883080.90729: variable 'omit' from source: magic vars 28983 1726883080.91173: variable 'ansible_distribution_major_version' from source: facts 28983 1726883080.91185: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883080.91604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883080.94558: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883080.94657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883080.94710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883080.94767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883080.94809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883080.94913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883080.94981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883080.95003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883080.95062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883080.95091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883080.95196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883080.95203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883080.95242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883080.95304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883080.95329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883080.95633: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883080.95641: variable 'ansible_facts' from source: unknown 28983 1726883080.96841: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883080.96851: when evaluation is False, skipping this task 28983 1726883080.96859: _execute() done 28983 1726883080.96867: dumping result to json 28983 1726883080.96878: done dumping result, returning 28983 1726883080.96891: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000001b92] 28983 1726883080.96902: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b92 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883080.97094: no more pending results, returning what we have 28983 1726883080.97098: results queue empty 28983 1726883080.97099: checking for any_errors_fatal 28983 1726883080.97101: done checking for any_errors_fatal 28983 1726883080.97102: checking for max_fail_percentage 28983 1726883080.97104: done checking for max_fail_percentage 28983 1726883080.97105: checking to see if all hosts have failed and the running result is not ok 28983 1726883080.97105: done checking to see if all hosts have failed 28983 1726883080.97106: getting the remaining hosts for this loop 28983 1726883080.97108: done getting the remaining hosts for this loop 28983 1726883080.97113: getting the next task for host managed_node2 28983 1726883080.97126: done getting next task for host managed_node2 28983 1726883080.97131: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883080.97140: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883080.97176: getting variables 28983 1726883080.97178: in VariableManager get_vars() 28983 1726883080.97228: Calling all_inventory to load vars for managed_node2 28983 1726883080.97231: Calling groups_inventory to load vars for managed_node2 28983 1726883080.97437: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883080.97450: Calling all_plugins_play to load vars for managed_node2 28983 1726883080.97454: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883080.97459: Calling groups_plugins_play to load vars for managed_node2 28983 1726883080.97981: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b92 28983 1726883080.97990: WORKER PROCESS EXITING 28983 1726883080.99794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883081.03007: done with get_vars() 28983 1726883081.03046: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:44:41 -0400 (0:00:00.135) 0:01:51.029 ****** 28983 1726883081.03176: entering _queue_task() for managed_node2/stat 28983 1726883081.03539: worker is 1 (out of 1 available) 28983 1726883081.03665: exiting _queue_task() for managed_node2/stat 28983 1726883081.03681: done queuing things up, now waiting for results queue to drain 28983 1726883081.03683: waiting for pending results... 
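Aside: the skip above ("Ensure ansible_facts used by role are present") comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False. A minimal Python sketch of that set-difference check follows; `required` and `gathered` are hypothetical stand-ins for the role's `__network_required_facts` list and the host's gathered fact names, not values taken from this run.

```python
# Illustrative sketch (assumption, not Ansible source): the task runs only
# when at least one required fact name is absent from the gathered facts.

def missing_facts(required, gathered):
    """Return required fact names not present in gathered facts
    (roughly what the `difference` filter computes)."""
    gathered_set = set(gathered)
    return [name for name in required if name not in gathered_set]

# All required facts already gathered -> conditional is False -> task skipped,
# matching the "Evaluated conditional ... : False" line in the log above.
required = ["distribution", "distribution_major_version", "os_family"]
gathered = ["distribution", "distribution_major_version", "os_family", "kernel"]
print(len(missing_facts(required, gathered)) > 0)  # False

# If a fact were missing, the conditional would be True and the setup
# module would run to gather it.
print(len(missing_facts(required, ["kernel"])) > 0)  # True
```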
28983 1726883081.04003: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883081.04155: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b94 28983 1726883081.04181: variable 'ansible_search_path' from source: unknown 28983 1726883081.04189: variable 'ansible_search_path' from source: unknown 28983 1726883081.04245: calling self._execute() 28983 1726883081.04367: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883081.04385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883081.04401: variable 'omit' from source: magic vars 28983 1726883081.04881: variable 'ansible_distribution_major_version' from source: facts 28983 1726883081.04940: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883081.05113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883081.05483: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883081.05554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883081.05603: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883081.05739: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883081.05776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883081.05816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883081.05868: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883081.05912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883081.06038: variable '__network_is_ostree' from source: set_fact 28983 1726883081.06058: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883081.06075: when evaluation is False, skipping this task 28983 1726883081.06085: _execute() done 28983 1726883081.06094: dumping result to json 28983 1726883081.06103: done dumping result, returning 28983 1726883081.06117: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000001b94] 28983 1726883081.06129: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b94 28983 1726883081.06358: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b94 28983 1726883081.06361: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883081.06430: no more pending results, returning what we have 28983 1726883081.06437: results queue empty 28983 1726883081.06438: checking for any_errors_fatal 28983 1726883081.06449: done checking for any_errors_fatal 28983 1726883081.06451: checking for max_fail_percentage 28983 1726883081.06453: done checking for max_fail_percentage 28983 1726883081.06454: checking to see if all hosts have failed and the running result is not ok 28983 1726883081.06455: done checking to see if all hosts have failed 28983 1726883081.06456: getting the remaining hosts for this loop 28983 1726883081.06458: done getting the remaining hosts for this loop 28983 
1726883081.06464: getting the next task for host managed_node2 28983 1726883081.06478: done getting next task for host managed_node2 28983 1726883081.06484: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883081.06492: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883081.06527: getting variables 28983 1726883081.06529: in VariableManager get_vars() 28983 1726883081.06692: Calling all_inventory to load vars for managed_node2 28983 1726883081.06696: Calling groups_inventory to load vars for managed_node2 28983 1726883081.06699: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883081.06710: Calling all_plugins_play to load vars for managed_node2 28983 1726883081.06714: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883081.06718: Calling groups_plugins_play to load vars for managed_node2 28983 1726883081.09439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883081.12608: done with get_vars() 28983 1726883081.12651: done getting variables 28983 1726883081.12722: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:44:41 -0400 (0:00:00.095) 0:01:51.125 ****** 28983 1726883081.12777: entering _queue_task() for managed_node2/set_fact 28983 1726883081.13149: worker is 1 (out of 1 available) 28983 1726883081.13340: exiting _queue_task() for managed_node2/set_fact 28983 1726883081.13353: done queuing things up, now waiting for results queue to drain 28983 1726883081.13355: waiting for pending results... 
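Aside: both ostree tasks above are skipped for the same reason: `when: not __network_is_ostree is defined` is False because an earlier `set_fact` already defined `__network_is_ostree`. A small sketch of that guard, where `facts` is a hypothetical stand-in for the host's fact store:

```python
# Minimal sketch (assumption for illustration) of the skip logic seen above:
# the stat/set_fact tasks run only when __network_is_ostree is NOT yet defined.

def should_run_ostree_check(facts):
    """Mirror the `when: not __network_is_ostree is defined` conditional."""
    return "__network_is_ostree" not in facts

facts = {"__network_is_ostree": False}  # set by an earlier set_fact in the run
print(should_run_ostree_check(facts))  # False -> task is skipped

print(should_run_ostree_check({}))  # True -> check would run on a fresh host
```

This is why the log reports `false_condition: "not __network_is_ostree is defined"` rather than re-running the detection on every include.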
28983 1726883081.13704: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883081.13749: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b95 28983 1726883081.13776: variable 'ansible_search_path' from source: unknown 28983 1726883081.13787: variable 'ansible_search_path' from source: unknown 28983 1726883081.13840: calling self._execute() 28983 1726883081.13965: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883081.13981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883081.13998: variable 'omit' from source: magic vars 28983 1726883081.14486: variable 'ansible_distribution_major_version' from source: facts 28983 1726883081.14505: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883081.14732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883081.15086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883081.15153: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883081.15218: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883081.15250: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883081.15364: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883081.15436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883081.15444: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883081.15484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883081.15602: variable '__network_is_ostree' from source: set_fact 28983 1726883081.15615: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883081.15653: when evaluation is False, skipping this task 28983 1726883081.15656: _execute() done 28983 1726883081.15658: dumping result to json 28983 1726883081.15660: done dumping result, returning 28983 1726883081.15663: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000001b95] 28983 1726883081.15674: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b95 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883081.15841: no more pending results, returning what we have 28983 1726883081.15846: results queue empty 28983 1726883081.15847: checking for any_errors_fatal 28983 1726883081.15856: done checking for any_errors_fatal 28983 1726883081.15857: checking for max_fail_percentage 28983 1726883081.15859: done checking for max_fail_percentage 28983 1726883081.15861: checking to see if all hosts have failed and the running result is not ok 28983 1726883081.15862: done checking to see if all hosts have failed 28983 1726883081.15863: getting the remaining hosts for this loop 28983 1726883081.15865: done getting the remaining hosts for this loop 28983 1726883081.15870: getting the next task for host managed_node2 28983 1726883081.15887: done getting next task for host managed_node2 28983 
1726883081.15894: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883081.15901: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883081.15937: getting variables 28983 1726883081.15939: in VariableManager get_vars() 28983 1726883081.15991: Calling all_inventory to load vars for managed_node2 28983 1726883081.15995: Calling groups_inventory to load vars for managed_node2 28983 1726883081.15998: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883081.16008: Calling all_plugins_play to load vars for managed_node2 28983 1726883081.16012: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883081.16016: Calling groups_plugins_play to load vars for managed_node2 28983 1726883081.16760: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b95 28983 1726883081.16763: WORKER PROCESS EXITING 28983 1726883081.18881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883081.21789: done with get_vars() 28983 1726883081.21818: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:44:41 -0400 (0:00:00.091) 0:01:51.216 ****** 28983 1726883081.21902: entering _queue_task() for managed_node2/service_facts 28983 1726883081.22179: worker is 1 (out of 1 available) 28983 1726883081.22193: exiting _queue_task() for managed_node2/service_facts 28983 1726883081.22208: done queuing things up, now waiting for results queue to drain 28983 1726883081.22210: waiting for pending results... 
28983 1726883081.22408: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883081.22536: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b97 28983 1726883081.22558: variable 'ansible_search_path' from source: unknown 28983 1726883081.22563: variable 'ansible_search_path' from source: unknown 28983 1726883081.22588: calling self._execute() 28983 1726883081.22677: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883081.22682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883081.22693: variable 'omit' from source: magic vars 28983 1726883081.23014: variable 'ansible_distribution_major_version' from source: facts 28983 1726883081.23024: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883081.23031: variable 'omit' from source: magic vars 28983 1726883081.23120: variable 'omit' from source: magic vars 28983 1726883081.23150: variable 'omit' from source: magic vars 28983 1726883081.23277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883081.23283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883081.23286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883081.23288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883081.23330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883081.23335: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883081.23343: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883081.23470: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883081.23476: Set connection var ansible_connection to ssh 28983 1726883081.23500: Set connection var ansible_shell_executable to /bin/sh 28983 1726883081.23521: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883081.23540: Set connection var ansible_timeout to 10 28983 1726883081.23554: Set connection var ansible_pipelining to False 28983 1726883081.23646: Set connection var ansible_shell_type to sh 28983 1726883081.23650: variable 'ansible_shell_executable' from source: unknown 28983 1726883081.23652: variable 'ansible_connection' from source: unknown 28983 1726883081.23654: variable 'ansible_module_compression' from source: unknown 28983 1726883081.23656: variable 'ansible_shell_type' from source: unknown 28983 1726883081.23659: variable 'ansible_shell_executable' from source: unknown 28983 1726883081.23660: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883081.23663: variable 'ansible_pipelining' from source: unknown 28983 1726883081.23665: variable 'ansible_timeout' from source: unknown 28983 1726883081.23667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883081.24013: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883081.24018: variable 'omit' from source: magic vars 28983 1726883081.24021: starting attempt loop 28983 1726883081.24023: running the handler 28983 1726883081.24025: _low_level_execute_command(): starting 28983 1726883081.24027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883081.24587: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883081.24599: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883081.24613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.24630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.24693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883081.24697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883081.24779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883081.26568: stdout chunk (state=3): >>>/root <<< 28983 1726883081.26756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883081.26759: stdout chunk (state=3): >>><<< 28983 1726883081.26761: stderr chunk (state=3): >>><<< 28983 1726883081.26842: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883081.26846: _low_level_execute_command(): starting 28983 1726883081.26850: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677 `" && echo ansible-tmp-1726883081.2678797-33041-209443729805677="` echo /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677 `" ) && sleep 0' 28983 1726883081.27469: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883081.27485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 
1726883081.27584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.27588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883081.27633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883081.27703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883081.29753: stdout chunk (state=3): >>>ansible-tmp-1726883081.2678797-33041-209443729805677=/root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677 <<< 28983 1726883081.29954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883081.29959: stdout chunk (state=3): >>><<< 28983 1726883081.29962: stderr chunk (state=3): >>><<< 28983 1726883081.30040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883081.2678797-33041-209443729805677=/root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883081.30045: variable 'ansible_module_compression' from source: unknown 28983 1726883081.30106: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883081.30148: variable 'ansible_facts' from source: unknown 28983 1726883081.30265: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py 28983 1726883081.30399: Sending initial data 28983 1726883081.30404: Sent initial data (162 bytes) 28983 1726883081.31154: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.31158: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 28983 1726883081.31161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883081.31199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883081.31310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883081.33024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883081.33029: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883081.33091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883081.33164: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2hc6r3gf /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py <<< 28983 1726883081.33167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py" <<< 28983 1726883081.33231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2hc6r3gf" to remote "/root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py" <<< 28983 1726883081.34594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883081.34651: stderr chunk (state=3): >>><<< 28983 1726883081.34655: stdout chunk (state=3): >>><<< 28983 1726883081.34672: done transferring module to remote 28983 1726883081.34683: _low_level_execute_command(): starting 28983 1726883081.34689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/ /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py && sleep 0' 28983 1726883081.35119: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883081.35122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.35125: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883081.35128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.35181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883081.35184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883081.35252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883081.37190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883081.37235: stderr chunk (state=3): >>><<< 28983 1726883081.37239: stdout chunk (state=3): >>><<< 28983 1726883081.37251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883081.37255: _low_level_execute_command(): starting 28983 1726883081.37261: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/AnsiballZ_service_facts.py && sleep 0' 28983 1726883081.37684: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883081.37687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.37690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883081.37692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883081.37740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883081.37749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 28983 1726883081.37827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883083.36341: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": 
{"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": 
"ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service<<< 28983 1726883083.36413: stdout chunk (state=3): >>>", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", 
"state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "sour<<< 28983 1726883083.36462: stdout chunk (state=3): >>>ce": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": 
"hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883083.38029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883083.38084: stderr chunk (state=3): >>><<< 28983 1726883083.38087: stdout chunk (state=3): >>><<< 28983 1726883083.38114: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": 
"bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": 
"plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883083.38838: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883083.38847: _low_level_execute_command(): starting 28983 1726883083.38854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883081.2678797-33041-209443729805677/ > /dev/null 2>&1 && sleep 0' 28983 1726883083.39480: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883083.39484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883083.39540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883083.39544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883083.39546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 
1726883083.39549: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883083.39551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.39554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883083.39569: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883083.39572: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883083.39588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883083.39596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883083.39610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883083.39619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883083.39628: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883083.39650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.39725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883083.39739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883083.39756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883083.40051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883083.41866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883083.41869: stdout chunk (state=3): >>><<< 28983 1726883083.41880: stderr chunk (state=3): >>><<< 28983 1726883083.41944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883083.41947: handler run complete 28983 1726883083.42201: variable 'ansible_facts' from source: unknown 28983 1726883083.42540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883083.43347: variable 'ansible_facts' from source: unknown 28983 1726883083.43480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883083.43836: attempt loop complete, returning result 28983 1726883083.43846: _execute() done 28983 1726883083.43849: dumping result to json 28983 1726883083.43936: done dumping result, returning 28983 1726883083.43963: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000001b97] 28983 1726883083.43967: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b97 28983 1726883083.45905: done 
sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b97 28983 1726883083.45908: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883083.46042: no more pending results, returning what we have 28983 1726883083.46046: results queue empty 28983 1726883083.46047: checking for any_errors_fatal 28983 1726883083.46052: done checking for any_errors_fatal 28983 1726883083.46053: checking for max_fail_percentage 28983 1726883083.46055: done checking for max_fail_percentage 28983 1726883083.46056: checking to see if all hosts have failed and the running result is not ok 28983 1726883083.46057: done checking to see if all hosts have failed 28983 1726883083.46058: getting the remaining hosts for this loop 28983 1726883083.46060: done getting the remaining hosts for this loop 28983 1726883083.46064: getting the next task for host managed_node2 28983 1726883083.46073: done getting next task for host managed_node2 28983 1726883083.46077: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883083.46085: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883083.46104: getting variables 28983 1726883083.46106: in VariableManager get_vars() 28983 1726883083.46261: Calling all_inventory to load vars for managed_node2 28983 1726883083.46264: Calling groups_inventory to load vars for managed_node2 28983 1726883083.46267: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883083.46277: Calling all_plugins_play to load vars for managed_node2 28983 1726883083.46281: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883083.46285: Calling groups_plugins_play to load vars for managed_node2 28983 1726883083.51158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883083.57404: done with get_vars() 28983 1726883083.57467: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:44:43 -0400 (0:00:02.358) 0:01:53.575 ****** 28983 1726883083.57805: entering _queue_task() for managed_node2/package_facts 28983 1726883083.58622: worker is 1 (out of 1 available) 28983 1726883083.58637: exiting _queue_task() for managed_node2/package_facts 28983 
1726883083.58654: done queuing things up, now waiting for results queue to drain 28983 1726883083.58656: waiting for pending results... 28983 1726883083.59224: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883083.59561: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b98 28983 1726883083.59578: variable 'ansible_search_path' from source: unknown 28983 1726883083.59582: variable 'ansible_search_path' from source: unknown 28983 1726883083.59668: calling self._execute() 28983 1726883083.59940: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883083.59946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883083.59961: variable 'omit' from source: magic vars 28983 1726883083.60815: variable 'ansible_distribution_major_version' from source: facts 28983 1726883083.60826: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883083.60835: variable 'omit' from source: magic vars 28983 1726883083.61241: variable 'omit' from source: magic vars 28983 1726883083.61245: variable 'omit' from source: magic vars 28983 1726883083.61248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883083.61353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883083.61356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883083.61359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883083.61362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883083.61381: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883083.61386: 
variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883083.61392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883083.61518: Set connection var ansible_connection to ssh 28983 1726883083.61532: Set connection var ansible_shell_executable to /bin/sh 28983 1726883083.61545: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883083.61556: Set connection var ansible_timeout to 10 28983 1726883083.61565: Set connection var ansible_pipelining to False 28983 1726883083.61573: Set connection var ansible_shell_type to sh 28983 1726883083.61842: variable 'ansible_shell_executable' from source: unknown 28983 1726883083.61845: variable 'ansible_connection' from source: unknown 28983 1726883083.61848: variable 'ansible_module_compression' from source: unknown 28983 1726883083.61850: variable 'ansible_shell_type' from source: unknown 28983 1726883083.61852: variable 'ansible_shell_executable' from source: unknown 28983 1726883083.61855: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883083.61857: variable 'ansible_pipelining' from source: unknown 28983 1726883083.61859: variable 'ansible_timeout' from source: unknown 28983 1726883083.61861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883083.61885: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883083.61894: variable 'omit' from source: magic vars 28983 1726883083.61905: starting attempt loop 28983 1726883083.61909: running the handler 28983 1726883083.61925: _low_level_execute_command(): starting 28983 1726883083.61938: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883083.62758: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.62831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883083.62849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883083.63051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883083.64733: stdout chunk (state=3): >>>/root <<< 28983 1726883083.64853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883083.65063: stderr chunk (state=3): >>><<< 28983 1726883083.65067: stdout chunk (state=3): >>><<< 28983 1726883083.65093: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883083.65109: _low_level_execute_command(): starting 28983 1726883083.65117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714 `" && echo ansible-tmp-1726883083.6509373-33088-268946310891714="` echo /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714 `" ) && sleep 0' 28983 1726883083.66759: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.66987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883083.66990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883083.67061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883083.67150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883083.69236: stdout chunk (state=3): >>>ansible-tmp-1726883083.6509373-33088-268946310891714=/root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714 <<< 28983 1726883083.69452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883083.69547: stderr chunk (state=3): >>><<< 28983 1726883083.69550: stdout chunk (state=3): >>><<< 28983 1726883083.69642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883083.6509373-33088-268946310891714=/root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883083.69645: variable 'ansible_module_compression' from source: unknown 28983 1726883083.69686: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883083.69856: variable 'ansible_facts' from source: unknown 28983 1726883083.70283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py 28983 1726883083.70724: Sending initial data 28983 1726883083.70727: Sent initial data (162 bytes) 28983 1726883083.72048: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.72105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 
1726883083.72284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883083.72347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883083.74155: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883083.74292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883083.74366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplamfzber /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py <<< 28983 1726883083.74384: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py" <<< 28983 1726883083.74454: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplamfzber" to remote "/root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py" <<< 28983 1726883083.81444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883083.81449: stdout chunk (state=3): >>><<< 28983 1726883083.81451: stderr chunk (state=3): >>><<< 28983 1726883083.81454: done transferring module to remote 28983 1726883083.81456: _low_level_execute_command(): starting 28983 1726883083.81458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/ /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py && sleep 0' 28983 1726883083.82980: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883083.82984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883083.82987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883083.82991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.83038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883083.83441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883083.85220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883083.85286: stderr chunk (state=3): >>><<< 28983 1726883083.85296: stdout chunk (state=3): >>><<< 28983 1726883083.85316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883083.85331: _low_level_execute_command(): starting 28983 1726883083.85343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/AnsiballZ_package_facts.py && sleep 0' 28983 1726883083.86369: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883083.86505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883083.86550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883083.86569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883083.86585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883083.86752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 <<< 28983 1726883084.50243: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", 
"source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 28983 1726883084.50265: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": 
[{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 28983 1726883084.50318: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": 
"hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": 
"54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": 
[{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": 
"libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", 
"version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": 
"python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": 
"volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], <<< 28983 1726883084.50565: stdout chunk (state=3): >>>"perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1,
"arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": 
"502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": 
[{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": 
"boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": 
"1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": 
[{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883084.52482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883084.52485: stdout chunk (state=3): >>><<< 28983 1726883084.52488: stderr chunk (state=3): >>><<< 28983 1726883084.52550: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", 
"version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", 
"version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": 
"libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": 
"nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": 
[{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": 
"python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": 
"2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", 
"version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": 
[{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", 
"version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": 
"6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.46.139 closed. 28983 1726883084.56846: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883084.56940: _low_level_execute_command(): starting 28983 1726883084.56943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883083.6509373-33088-268946310891714/ > /dev/null 2>&1 && sleep 0' 28983 1726883084.57591: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883084.57605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883084.57621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883084.57653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883084.57759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883084.57789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883084.57816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883084.57905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883084.59981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883084.59992: stdout chunk (state=3): >>><<< 28983 1726883084.60010: stderr chunk (state=3): >>><<< 28983 1726883084.60028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 28983 1726883084.60042: handler run complete 28983 1726883084.61581: variable 'ansible_facts' from source: unknown 28983 1726883084.62427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883084.66373: variable 'ansible_facts' from source: unknown 28983 1726883084.67149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883084.67921: attempt loop complete, returning result 28983 1726883084.67940: _execute() done 28983 1726883084.67943: dumping result to json 28983 1726883084.68127: done dumping result, returning 28983 1726883084.68137: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000001b98] 28983 1726883084.68143: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b98 28983 1726883084.70706: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b98 28983 1726883084.70709: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883084.70887: no more pending results, returning what we have 28983 1726883084.70889: results queue empty 28983 1726883084.70890: checking for any_errors_fatal 28983 1726883084.70895: done checking for any_errors_fatal 28983 1726883084.70895: checking for max_fail_percentage 28983 1726883084.70897: done checking for max_fail_percentage 28983 1726883084.70897: checking to see if all hosts have failed and the running result is not ok 28983 1726883084.70898: done checking to see if all hosts have failed 28983 1726883084.70898: getting the remaining hosts for this loop 28983 1726883084.70900: done getting the remaining hosts for this loop 28983 1726883084.70903: getting the next task for host managed_node2 28983 1726883084.70909: done 
getting next task for host managed_node2 28983 1726883084.70911: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883084.70916: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883084.70928: getting variables 28983 1726883084.70929: in VariableManager get_vars() 28983 1726883084.70958: Calling all_inventory to load vars for managed_node2 28983 1726883084.70960: Calling groups_inventory to load vars for managed_node2 28983 1726883084.70962: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883084.70969: Calling all_plugins_play to load vars for managed_node2 28983 1726883084.70973: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883084.70975: Calling groups_plugins_play to load vars for managed_node2 28983 1726883084.72181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883084.74545: done with get_vars() 28983 1726883084.74574: done getting variables 28983 1726883084.74629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:44 -0400 (0:00:01.168) 0:01:54.744 ****** 28983 1726883084.74666: entering _queue_task() for managed_node2/debug 28983 1726883084.74941: worker is 1 (out of 1 available) 28983 1726883084.74956: exiting _queue_task() for managed_node2/debug 28983 1726883084.74970: done queuing things up, now waiting for results queue to drain 28983 1726883084.74972: waiting for pending results... 
28983 1726883084.75176: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883084.75307: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b3c 28983 1726883084.75322: variable 'ansible_search_path' from source: unknown 28983 1726883084.75326: variable 'ansible_search_path' from source: unknown 28983 1726883084.75382: calling self._execute() 28983 1726883084.75523: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883084.75528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883084.75541: variable 'omit' from source: magic vars 28983 1726883084.76239: variable 'ansible_distribution_major_version' from source: facts 28983 1726883084.76243: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883084.76246: variable 'omit' from source: magic vars 28983 1726883084.76248: variable 'omit' from source: magic vars 28983 1726883084.76261: variable 'network_provider' from source: set_fact 28983 1726883084.76282: variable 'omit' from source: magic vars 28983 1726883084.76347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883084.76395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883084.76415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883084.76439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883084.76453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883084.76488: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883084.76492: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883084.76497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883084.76600: Set connection var ansible_connection to ssh 28983 1726883084.76613: Set connection var ansible_shell_executable to /bin/sh 28983 1726883084.76620: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883084.76636: Set connection var ansible_timeout to 10 28983 1726883084.76646: Set connection var ansible_pipelining to False 28983 1726883084.76649: Set connection var ansible_shell_type to sh 28983 1726883084.76687: variable 'ansible_shell_executable' from source: unknown 28983 1726883084.76691: variable 'ansible_connection' from source: unknown 28983 1726883084.76693: variable 'ansible_module_compression' from source: unknown 28983 1726883084.76696: variable 'ansible_shell_type' from source: unknown 28983 1726883084.76698: variable 'ansible_shell_executable' from source: unknown 28983 1726883084.76701: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883084.76703: variable 'ansible_pipelining' from source: unknown 28983 1726883084.76705: variable 'ansible_timeout' from source: unknown 28983 1726883084.76707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883084.76868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883084.76879: variable 'omit' from source: magic vars 28983 1726883084.76887: starting attempt loop 28983 1726883084.76890: running the handler 28983 1726883084.77059: handler run complete 28983 1726883084.77061: attempt loop complete, returning result 28983 1726883084.77064: _execute() done 28983 1726883084.77066: dumping result to json 28983 1726883084.77068: done dumping result, returning 
28983 1726883084.77070: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000001b3c] 28983 1726883084.77074: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3c 28983 1726883084.77144: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3c 28983 1726883084.77148: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883084.77450: no more pending results, returning what we have 28983 1726883084.77455: results queue empty 28983 1726883084.77456: checking for any_errors_fatal 28983 1726883084.77465: done checking for any_errors_fatal 28983 1726883084.77466: checking for max_fail_percentage 28983 1726883084.77468: done checking for max_fail_percentage 28983 1726883084.77469: checking to see if all hosts have failed and the running result is not ok 28983 1726883084.77470: done checking to see if all hosts have failed 28983 1726883084.77474: getting the remaining hosts for this loop 28983 1726883084.77475: done getting the remaining hosts for this loop 28983 1726883084.77480: getting the next task for host managed_node2 28983 1726883084.77488: done getting next task for host managed_node2 28983 1726883084.77493: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883084.77498: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883084.77513: getting variables 28983 1726883084.77515: in VariableManager get_vars() 28983 1726883084.77561: Calling all_inventory to load vars for managed_node2 28983 1726883084.77565: Calling groups_inventory to load vars for managed_node2 28983 1726883084.77568: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883084.77580: Calling all_plugins_play to load vars for managed_node2 28983 1726883084.77584: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883084.77588: Calling groups_plugins_play to load vars for managed_node2 28983 1726883084.79190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883084.81301: done with get_vars() 28983 1726883084.81325: done getting variables 28983 1726883084.81377: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:44 -0400 (0:00:00.067) 0:01:54.811 ****** 28983 1726883084.81410: entering _queue_task() for managed_node2/fail 28983 1726883084.81667: worker is 1 (out of 1 available) 28983 1726883084.81685: exiting _queue_task() for managed_node2/fail 28983 1726883084.81698: done queuing things up, now waiting for results queue to drain 28983 1726883084.81700: waiting for pending results... 28983 1726883084.81892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883084.82024: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b3d 28983 1726883084.82039: variable 'ansible_search_path' from source: unknown 28983 1726883084.82046: variable 'ansible_search_path' from source: unknown 28983 1726883084.82079: calling self._execute() 28983 1726883084.82165: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883084.82174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883084.82185: variable 'omit' from source: magic vars 28983 1726883084.82509: variable 'ansible_distribution_major_version' from source: facts 28983 1726883084.82519: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883084.82630: variable 'network_state' from source: role '' defaults 28983 1726883084.82642: Evaluated conditional (network_state != {}): False 28983 1726883084.82645: when evaluation is False, skipping this task 28983 1726883084.82649: _execute() done 28983 1726883084.82652: dumping result to json 28983 1726883084.82656: done dumping result, returning 28983 1726883084.82663: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000001b3d] 28983 1726883084.82670: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3d 28983 1726883084.82775: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3d 28983 1726883084.82779: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883084.82848: no more pending results, returning what we have 28983 1726883084.82852: results queue empty 28983 1726883084.82853: checking for any_errors_fatal 28983 1726883084.82859: done checking for any_errors_fatal 28983 1726883084.82860: checking for max_fail_percentage 28983 1726883084.82861: done checking for max_fail_percentage 28983 1726883084.82863: checking to see if all hosts have failed and the running result is not ok 28983 1726883084.82864: done checking to see if all hosts have failed 28983 1726883084.82865: getting the remaining hosts for this loop 28983 1726883084.82866: done getting the remaining hosts for this loop 28983 1726883084.82873: getting the next task for host managed_node2 28983 1726883084.82882: done getting next task for host managed_node2 28983 1726883084.82887: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883084.82894: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883084.82918: getting variables 28983 1726883084.82920: in VariableManager get_vars() 28983 1726883084.82959: Calling all_inventory to load vars for managed_node2 28983 1726883084.82962: Calling groups_inventory to load vars for managed_node2 28983 1726883084.82965: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883084.82976: Calling all_plugins_play to load vars for managed_node2 28983 1726883084.82980: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883084.82983: Calling groups_plugins_play to load vars for managed_node2 28983 1726883084.84448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883084.87445: done with get_vars() 28983 1726883084.87489: done getting variables 28983 1726883084.87567: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:44 -0400 (0:00:00.061) 0:01:54.873 ****** 28983 1726883084.87616: entering _queue_task() for managed_node2/fail 28983 1726883084.87997: worker is 1 (out of 1 available) 28983 1726883084.88013: exiting _queue_task() for managed_node2/fail 28983 1726883084.88028: done queuing things up, now waiting for results queue to drain 28983 1726883084.88031: waiting for pending results... 28983 1726883084.88594: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883084.88605: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b3e 28983 1726883084.88610: variable 'ansible_search_path' from source: unknown 28983 1726883084.88613: variable 'ansible_search_path' from source: unknown 28983 1726883084.88631: calling self._execute() 28983 1726883084.88725: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883084.88732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883084.88745: variable 'omit' from source: magic vars 28983 1726883084.89079: variable 'ansible_distribution_major_version' from source: facts 28983 1726883084.89091: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883084.89197: variable 'network_state' from source: role '' defaults 28983 1726883084.89210: Evaluated conditional (network_state != {}): False 28983 1726883084.89214: when evaluation is False, skipping this task 28983 1726883084.89217: _execute() done 28983 1726883084.89220: dumping result to json 28983 1726883084.89225: done dumping result, returning 28983 1726883084.89233: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000001b3e] 28983 1726883084.89241: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3e 28983 1726883084.89338: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3e 28983 1726883084.89341: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883084.89414: no more pending results, returning what we have 28983 1726883084.89417: results queue empty 28983 1726883084.89418: checking for any_errors_fatal 28983 1726883084.89425: done checking for any_errors_fatal 28983 1726883084.89426: checking for max_fail_percentage 28983 1726883084.89428: done checking for max_fail_percentage 28983 1726883084.89429: checking to see if all hosts have failed and the running result is not ok 28983 1726883084.89430: done checking to see if all hosts have failed 28983 1726883084.89431: getting the remaining hosts for this loop 28983 1726883084.89433: done getting the remaining hosts for this loop 28983 1726883084.89439: getting the next task for host managed_node2 28983 1726883084.89447: done getting next task for host managed_node2 28983 1726883084.89452: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883084.89458: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883084.89483: getting variables 28983 1726883084.89485: in VariableManager get_vars() 28983 1726883084.89522: Calling all_inventory to load vars for managed_node2 28983 1726883084.89526: Calling groups_inventory to load vars for managed_node2 28983 1726883084.89528: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883084.89545: Calling all_plugins_play to load vars for managed_node2 28983 1726883084.89548: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883084.89552: Calling groups_plugins_play to load vars for managed_node2 28983 1726883084.95003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883084.96581: done with get_vars() 28983 1726883084.96605: done getting variables 28983 1726883084.96650: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:44 -0400 (0:00:00.090) 0:01:54.964 ****** 28983 1726883084.96676: entering _queue_task() for managed_node2/fail 28983 1726883084.96954: worker is 1 (out of 1 available) 28983 1726883084.96968: exiting _queue_task() for managed_node2/fail 28983 1726883084.96980: done queuing things up, now waiting for results queue to drain 28983 1726883084.96982: waiting for pending results... 28983 1726883084.97190: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883084.97333: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b3f 28983 1726883084.97347: variable 'ansible_search_path' from source: unknown 28983 1726883084.97351: variable 'ansible_search_path' from source: unknown 28983 1726883084.97386: calling self._execute() 28983 1726883084.97477: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883084.97486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883084.97497: variable 'omit' from source: magic vars 28983 1726883084.97833: variable 'ansible_distribution_major_version' from source: facts 28983 1726883084.97845: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883084.98005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883084.99852: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883084.99914: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883084.99949: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883084.99984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.00008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.00082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.00106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.00128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.00165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.00239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.00306: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.00326: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883085.00473: variable 'ansible_distribution' from source: facts 28983 1726883085.00486: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.00500: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883085.00509: when evaluation is False, skipping this task 28983 
1726883085.00516: _execute() done 28983 1726883085.00524: dumping result to json 28983 1726883085.00533: done dumping result, returning 28983 1726883085.00651: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000001b3f] 28983 1726883085.00657: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3f 28983 1726883085.00738: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b3f 28983 1726883085.00742: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883085.00804: no more pending results, returning what we have 28983 1726883085.00808: results queue empty 28983 1726883085.00809: checking for any_errors_fatal 28983 1726883085.00819: done checking for any_errors_fatal 28983 1726883085.00820: checking for max_fail_percentage 28983 1726883085.00822: done checking for max_fail_percentage 28983 1726883085.00823: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.00823: done checking to see if all hosts have failed 28983 1726883085.00824: getting the remaining hosts for this loop 28983 1726883085.00826: done getting the remaining hosts for this loop 28983 1726883085.00831: getting the next task for host managed_node2 28983 1726883085.00842: done getting next task for host managed_node2 28983 1726883085.00846: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883085.00853: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883085.01002: getting variables 28983 1726883085.01004: in VariableManager get_vars() 28983 1726883085.01045: Calling all_inventory to load vars for managed_node2 28983 1726883085.01048: Calling groups_inventory to load vars for managed_node2 28983 1726883085.01051: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.01060: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.01063: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.01066: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.02552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.04868: done with get_vars() 28983 1726883085.04908: done getting variables 28983 1726883085.04981: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:45 -0400 (0:00:00.083) 0:01:55.047 ****** 28983 1726883085.05020: entering _queue_task() for managed_node2/dnf 28983 1726883085.05378: worker is 1 (out of 1 available) 28983 1726883085.05392: exiting _queue_task() for managed_node2/dnf 28983 1726883085.05405: done queuing things up, now waiting for results queue to drain 28983 1726883085.05407: waiting for pending results... 28983 1726883085.05766: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883085.06041: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b40 28983 1726883085.06045: variable 'ansible_search_path' from source: unknown 28983 1726883085.06048: variable 'ansible_search_path' from source: unknown 28983 1726883085.06051: calling self._execute() 28983 1726883085.06127: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.06145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.06166: variable 'omit' from source: magic vars 28983 1726883085.06644: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.06663: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.06946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883085.09764: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883085.09861: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883085.09920: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883085.09981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.10039: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.10120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.10160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.10200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.10439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.10443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.10446: variable 'ansible_distribution' from source: facts 28983 1726883085.10449: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.10451: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883085.10586: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.10782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.10820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.10860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.10923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.11008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.11014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.11053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.11092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.11155: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.11181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.11245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.11284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.11323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.11387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.11442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.11639: variable 'network_connections' from source: include params 28983 1726883085.11662: variable 'interface' from source: play vars 28983 1726883085.11745: variable 'interface' from source: play vars 28983 1726883085.11847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883085.12097: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883085.12132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883085.12178: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883085.12222: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883085.12283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883085.12424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883085.12437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.12440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883085.12467: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883085.12820: variable 'network_connections' from source: include params 28983 1726883085.12831: variable 'interface' from source: play vars 28983 1726883085.12917: variable 'interface' from source: play vars 28983 1726883085.12952: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883085.12965: when evaluation is False, skipping this task 28983 1726883085.12979: _execute() done 28983 1726883085.12988: dumping result to json 28983 1726883085.12998: done dumping result, returning 28983 1726883085.13011: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001b40] 28983 1726883085.13023: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b40 28983 1726883085.13342: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b40 28983 1726883085.13346: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883085.13401: no more pending results, returning what we have 28983 1726883085.13404: results queue empty 28983 1726883085.13405: checking for any_errors_fatal 28983 1726883085.13412: done checking for any_errors_fatal 28983 1726883085.13413: checking for max_fail_percentage 28983 1726883085.13415: done checking for max_fail_percentage 28983 1726883085.13416: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.13417: done checking to see if all hosts have failed 28983 1726883085.13418: getting the remaining hosts for this loop 28983 1726883085.13420: done getting the remaining hosts for this loop 28983 1726883085.13424: getting the next task for host managed_node2 28983 1726883085.13432: done getting next task for host managed_node2 28983 1726883085.13438: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883085.13445: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883085.13478: getting variables 28983 1726883085.13480: in VariableManager get_vars() 28983 1726883085.13530: Calling all_inventory to load vars for managed_node2 28983 1726883085.13705: Calling groups_inventory to load vars for managed_node2 28983 1726883085.13709: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.13719: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.13723: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.13727: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.15970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.19076: done with get_vars() 28983 1726883085.19111: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883085.19201: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:45 -0400 (0:00:00.142) 0:01:55.190 ****** 28983 1726883085.19241: entering _queue_task() for managed_node2/yum 28983 1726883085.19586: worker is 1 (out of 1 available) 28983 1726883085.19600: exiting _queue_task() for managed_node2/yum 28983 1726883085.19612: done queuing things up, now waiting for results queue to drain 28983 1726883085.19614: waiting for pending results... 28983 1726883085.19938: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883085.20151: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b41 28983 1726883085.20183: variable 'ansible_search_path' from source: unknown 28983 1726883085.20192: variable 'ansible_search_path' from source: unknown 28983 1726883085.20241: calling self._execute() 28983 1726883085.20359: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.20376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.20399: variable 'omit' from source: magic vars 28983 1726883085.21041: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.21045: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.21123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883085.24392: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883085.24484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883085.24550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883085.24601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.24644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.24747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.24792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.24830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.24899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.24920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.25045: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.25073: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883085.25082: when evaluation is False, skipping this task 28983 1726883085.25090: _execute() done 28983 1726883085.25098: dumping result to json 28983 1726883085.25106: done dumping result, returning 28983 1726883085.25118: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001b41] 28983 1726883085.25128: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b41 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883085.25314: no more pending results, returning what we have 28983 1726883085.25318: results queue empty 28983 1726883085.25319: checking for any_errors_fatal 28983 1726883085.25328: done checking for any_errors_fatal 28983 1726883085.25329: checking for max_fail_percentage 28983 1726883085.25331: done checking for max_fail_percentage 28983 1726883085.25332: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.25333: done checking to see if all hosts have failed 28983 1726883085.25336: getting the remaining hosts for this loop 28983 1726883085.25338: done getting the remaining hosts for this loop 28983 1726883085.25343: getting the next task for host managed_node2 28983 1726883085.25353: done getting next task for host managed_node2 28983 1726883085.25358: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883085.25365: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883085.25400: getting variables 28983 1726883085.25402: in VariableManager get_vars() 28983 1726883085.25558: Calling all_inventory to load vars for managed_node2 28983 1726883085.25562: Calling groups_inventory to load vars for managed_node2 28983 1726883085.25565: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.25578: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.25582: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.25586: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.26251: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b41 28983 1726883085.26255: WORKER PROCESS EXITING 28983 1726883085.28340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.31390: done with get_vars() 28983 1726883085.31425: done getting variables 28983 1726883085.31496: Loading ActionModule 'fail' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:45 -0400 (0:00:00.122) 0:01:55.313 ****** 28983 1726883085.31540: entering _queue_task() for managed_node2/fail 28983 1726883085.31896: worker is 1 (out of 1 available) 28983 1726883085.31910: exiting _queue_task() for managed_node2/fail 28983 1726883085.31924: done queuing things up, now waiting for results queue to drain 28983 1726883085.31926: waiting for pending results... 28983 1726883085.32251: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883085.32456: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b42 28983 1726883085.32487: variable 'ansible_search_path' from source: unknown 28983 1726883085.32496: variable 'ansible_search_path' from source: unknown 28983 1726883085.32542: calling self._execute() 28983 1726883085.32665: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.32681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.32703: variable 'omit' from source: magic vars 28983 1726883085.33166: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.33187: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.33358: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.33636: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883085.36433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883085.36530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883085.36590: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883085.36640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.36685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.36891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.36895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.36898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.36935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.36962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.37032: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.37068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.37109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.37162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.37187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.37246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.37281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.37315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.37370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883085.37395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.37619: variable 'network_connections' from source: include params 28983 1726883085.37641: variable 'interface' from source: play vars 28983 1726883085.37760: variable 'interface' from source: play vars 28983 1726883085.37830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883085.38028: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883085.38094: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883085.38138: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883085.38197: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883085.38246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883085.38307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883085.38327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.38373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883085.38444: variable 
'__network_team_connections_defined' from source: role '' defaults 28983 1726883085.38853: variable 'network_connections' from source: include params 28983 1726883085.38857: variable 'interface' from source: play vars 28983 1726883085.38901: variable 'interface' from source: play vars 28983 1726883085.38932: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883085.38945: when evaluation is False, skipping this task 28983 1726883085.38953: _execute() done 28983 1726883085.38968: dumping result to json 28983 1726883085.38981: done dumping result, returning 28983 1726883085.38995: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001b42] 28983 1726883085.39074: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b42 28983 1726883085.39175: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b42 28983 1726883085.39180: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883085.39248: no more pending results, returning what we have 28983 1726883085.39252: results queue empty 28983 1726883085.39253: checking for any_errors_fatal 28983 1726883085.39263: done checking for any_errors_fatal 28983 1726883085.39264: checking for max_fail_percentage 28983 1726883085.39267: done checking for max_fail_percentage 28983 1726883085.39268: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.39269: done checking to see if all hosts have failed 28983 1726883085.39270: getting the remaining hosts for this loop 28983 1726883085.39275: done getting the remaining hosts for this loop 28983 1726883085.39281: getting the next task for host 
managed_node2 28983 1726883085.39291: done getting next task for host managed_node2 28983 1726883085.39296: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883085.39304: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883085.39646: getting variables 28983 1726883085.39649: in VariableManager get_vars() 28983 1726883085.39700: Calling all_inventory to load vars for managed_node2 28983 1726883085.39703: Calling groups_inventory to load vars for managed_node2 28983 1726883085.39706: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.39717: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.39721: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.39725: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.42036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.45082: done with get_vars() 28983 1726883085.45121: done getting variables 28983 1726883085.45199: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:45 -0400 (0:00:00.137) 0:01:55.450 ****** 28983 1726883085.45248: entering _queue_task() for managed_node2/package 28983 1726883085.45653: worker is 1 (out of 1 available) 28983 1726883085.45667: exiting _queue_task() for managed_node2/package 28983 1726883085.45684: done queuing things up, now waiting for results queue to drain 28983 1726883085.45686: waiting for pending results... 
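The `skipping: [managed_node2]` result above follows the standard role pattern: a `fail` task guarded by a `when:` expression, which Ansible short-circuits into a skip (recording `false_condition` and `skip_reason`) when the expression evaluates false. A minimal sketch of such a task is below; this is illustrative only — the real task lives at `roles/network/tasks/main.yml:60` in the `fedora.linux_system_roles` collection and its exact body and message are not shown in this log:

```yaml
# Hypothetical reconstruction of the consent-check pattern seen in the log.
# Only the `when:` expression is taken from the log output; the fail message
# and variable name network_allow_restart are invented for illustration.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: >-
      NetworkManager must be restarted to apply wireless or team interface
      changes; set a consent variable (e.g. network_allow_restart) to proceed.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With both variables false for this play's connections, the conditional evaluates to `False` and the worker returns the skip result seen earlier instead of executing the `fail` action.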
28983 1726883085.46156: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883085.46228: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b43 28983 1726883085.46257: variable 'ansible_search_path' from source: unknown 28983 1726883085.46266: variable 'ansible_search_path' from source: unknown 28983 1726883085.46314: calling self._execute() 28983 1726883085.46437: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.46451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.46539: variable 'omit' from source: magic vars 28983 1726883085.46943: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.46961: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.47207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883085.47522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883085.47586: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883085.47629: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883085.47719: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883085.47858: variable 'network_packages' from source: role '' defaults 28983 1726883085.48002: variable '__network_provider_setup' from source: role '' defaults 28983 1726883085.48021: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883085.48115: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883085.48216: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883085.48219: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726883085.48501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883085.51056: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883085.51147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883085.51203: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883085.51250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.51296: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.51414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.51459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.51504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.51565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.51593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883085.51648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.51668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.51694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.51727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.51741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.51936: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883085.52049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.52070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.52094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.52126: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.52140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.52221: variable 'ansible_python' from source: facts 28983 1726883085.52237: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883085.52308: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883085.52375: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883085.52485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.52508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.52528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.52561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.52577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.52617: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.52642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.52662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.52698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.52710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.52830: variable 'network_connections' from source: include params 28983 1726883085.52838: variable 'interface' from source: play vars 28983 1726883085.52923: variable 'interface' from source: play vars 28983 1726883085.53013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883085.53027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883085.53056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.53084: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883085.53126: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.53436: variable 'network_connections' from source: include params 28983 1726883085.53440: variable 'interface' from source: play vars 28983 1726883085.53549: variable 'interface' from source: play vars 28983 1726883085.53575: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883085.53641: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.53896: variable 'network_connections' from source: include params 28983 1726883085.53901: variable 'interface' from source: play vars 28983 1726883085.53955: variable 'interface' from source: play vars 28983 1726883085.53982: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883085.54045: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883085.54444: variable 'network_connections' from source: include params 28983 1726883085.54448: variable 'interface' from source: play vars 28983 1726883085.54450: variable 'interface' from source: play vars 28983 1726883085.54494: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883085.54563: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883085.54570: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883085.54644: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883085.54927: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883085.55324: variable 'network_connections' from source: include params 28983 1726883085.55329: variable 'interface' from 
source: play vars 28983 1726883085.55383: variable 'interface' from source: play vars 28983 1726883085.55390: variable 'ansible_distribution' from source: facts 28983 1726883085.55395: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.55402: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.55414: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883085.55555: variable 'ansible_distribution' from source: facts 28983 1726883085.55559: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.55565: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.55572: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883085.55712: variable 'ansible_distribution' from source: facts 28983 1726883085.55716: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.55722: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.55753: variable 'network_provider' from source: set_fact 28983 1726883085.55767: variable 'ansible_facts' from source: unknown 28983 1726883085.56470: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883085.56474: when evaluation is False, skipping this task 28983 1726883085.56479: _execute() done 28983 1726883085.56483: dumping result to json 28983 1726883085.56489: done dumping result, returning 28983 1726883085.56497: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000001b43] 28983 1726883085.56503: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b43 28983 1726883085.56605: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b43 28983 1726883085.56609: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883085.56673: no more pending results, returning what we have 28983 1726883085.56677: results queue empty 28983 1726883085.56678: checking for any_errors_fatal 28983 1726883085.56688: done checking for any_errors_fatal 28983 1726883085.56689: checking for max_fail_percentage 28983 1726883085.56691: done checking for max_fail_percentage 28983 1726883085.56692: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.56692: done checking to see if all hosts have failed 28983 1726883085.56693: getting the remaining hosts for this loop 28983 1726883085.56696: done getting the remaining hosts for this loop 28983 1726883085.56700: getting the next task for host managed_node2 28983 1726883085.56709: done getting next task for host managed_node2 28983 1726883085.56713: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883085.56720: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883085.56752: getting variables 28983 1726883085.56754: in VariableManager get_vars() 28983 1726883085.56804: Calling all_inventory to load vars for managed_node2 28983 1726883085.56807: Calling groups_inventory to load vars for managed_node2 28983 1726883085.56810: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.56820: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.56823: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.56827: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.58105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.59718: done with get_vars() 28983 1726883085.59746: done getting variables 28983 1726883085.59796: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:45 -0400 (0:00:00.145) 0:01:55.596 ****** 28983 1726883085.59830: entering _queue_task() for managed_node2/package 28983 1726883085.60091: worker is 1 (out of 1 available) 28983 1726883085.60106: exiting _queue_task() for managed_node2/package 28983 1726883085.60120: done queuing things up, now waiting for results queue to drain 28983 
1726883085.60122: waiting for pending results... 28983 1726883085.60340: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883085.60475: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b44 28983 1726883085.60488: variable 'ansible_search_path' from source: unknown 28983 1726883085.60492: variable 'ansible_search_path' from source: unknown 28983 1726883085.60525: calling self._execute() 28983 1726883085.60611: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.60617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.60627: variable 'omit' from source: magic vars 28983 1726883085.61140: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.61144: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.61147: variable 'network_state' from source: role '' defaults 28983 1726883085.61150: Evaluated conditional (network_state != {}): False 28983 1726883085.61153: when evaluation is False, skipping this task 28983 1726883085.61155: _execute() done 28983 1726883085.61158: dumping result to json 28983 1726883085.61160: done dumping result, returning 28983 1726883085.61163: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001b44] 28983 1726883085.61165: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b44 28983 1726883085.61258: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b44 28983 1726883085.61260: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883085.61327: no more pending results, returning what we have 28983 1726883085.61332: 
results queue empty 28983 1726883085.61333: checking for any_errors_fatal 28983 1726883085.61341: done checking for any_errors_fatal 28983 1726883085.61342: checking for max_fail_percentage 28983 1726883085.61344: done checking for max_fail_percentage 28983 1726883085.61345: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.61345: done checking to see if all hosts have failed 28983 1726883085.61346: getting the remaining hosts for this loop 28983 1726883085.61348: done getting the remaining hosts for this loop 28983 1726883085.61352: getting the next task for host managed_node2 28983 1726883085.61361: done getting next task for host managed_node2 28983 1726883085.61365: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883085.61374: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883085.61399: getting variables 28983 1726883085.61401: in VariableManager get_vars() 28983 1726883085.61444: Calling all_inventory to load vars for managed_node2 28983 1726883085.61447: Calling groups_inventory to load vars for managed_node2 28983 1726883085.61450: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.61458: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.61462: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.61465: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.63546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.65116: done with get_vars() 28983 1726883085.65139: done getting variables 28983 1726883085.65187: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:45 -0400 (0:00:00.053) 0:01:55.649 ****** 28983 1726883085.65216: entering _queue_task() for managed_node2/package 28983 1726883085.65432: worker is 1 (out of 1 available) 28983 1726883085.65447: exiting _queue_task() for managed_node2/package 28983 1726883085.65462: done queuing things up, now waiting for results queue to drain 28983 1726883085.65464: waiting for pending results... 
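Earlier in the log, the "Install packages" task was skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated to `False`, i.e. every required package was already present on the managed node. Jinja2's `subset` test behaves like Python's `set.issubset`; a rough stand-alone analogue is below (the package data is invented for illustration — real values come from `package_facts` on the managed node):

```python
# Rough Python analogue of the Jinja2 condition evaluated in the log:
#   not network_packages is subset(ansible_facts.packages.keys())
# Package names/versions below are placeholders, not from the actual host.
network_packages = ["NetworkManager"]
installed = {
    "NetworkManager": [{"version": "1.46.0"}],
    "openssh-server": [{"version": "9.6"}],
}

# True  -> some required package is missing, so the install task would run;
# False -> all packages present, so the task is skipped (as on managed_node2).
need_install = not set(network_packages).issubset(installed.keys())
print(need_install)  # False
```

The same skip-on-false mechanics apply to the two `network_state != {}` conditionals that follow: `network_state` defaults to an empty dict in the role, so both nmstate-related install tasks skip unless the caller passes a `network_state` value.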
28983 1726883085.65648: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883085.65763: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b45 28983 1726883085.65777: variable 'ansible_search_path' from source: unknown 28983 1726883085.65784: variable 'ansible_search_path' from source: unknown 28983 1726883085.65814: calling self._execute() 28983 1726883085.65897: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.65903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.65915: variable 'omit' from source: magic vars 28983 1726883085.66220: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.66232: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.66331: variable 'network_state' from source: role '' defaults 28983 1726883085.66344: Evaluated conditional (network_state != {}): False 28983 1726883085.66349: when evaluation is False, skipping this task 28983 1726883085.66352: _execute() done 28983 1726883085.66355: dumping result to json 28983 1726883085.66358: done dumping result, returning 28983 1726883085.66368: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001b45] 28983 1726883085.66376: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b45 28983 1726883085.66484: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b45 28983 1726883085.66487: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883085.66540: no more pending results, returning what we have 28983 1726883085.66544: results queue empty 28983 1726883085.66545: checking for 
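
The skip above shows Ansible evaluating a `when:` condition on a conditional install task: the `package` action module is loaded, `network_state != {}` evaluates to `False` (the role default `network_state` is empty), and the task is skipped without contacting the host. Based on the task name, action plugin, task path (`roles/network/tasks/main.yml:96`), and `false_condition` reported in the log, the task likely looks roughly like this; the exact package spec and module arguments are assumptions, not taken from the collection source:

```yaml
# Hypothetical sketch of the skipped task (actual definition:
# roles/network/tasks/main.yml:96 in fedora.linux_system_roles.network)
- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate     # assumed from the task name
    state: present
  when: network_state != {}      # false_condition reported in the log

```
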
any_errors_fatal 28983 1726883085.66550: done checking for any_errors_fatal 28983 1726883085.66551: checking for max_fail_percentage 28983 1726883085.66553: done checking for max_fail_percentage 28983 1726883085.66554: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.66555: done checking to see if all hosts have failed 28983 1726883085.66556: getting the remaining hosts for this loop 28983 1726883085.66558: done getting the remaining hosts for this loop 28983 1726883085.66562: getting the next task for host managed_node2 28983 1726883085.66569: done getting next task for host managed_node2 28983 1726883085.66576: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883085.66582: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883085.66608: getting variables 28983 1726883085.66610: in VariableManager get_vars() 28983 1726883085.66648: Calling all_inventory to load vars for managed_node2 28983 1726883085.66650: Calling groups_inventory to load vars for managed_node2 28983 1726883085.66652: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.66659: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.66661: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.66663: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.67848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.69528: done with get_vars() 28983 1726883085.69552: done getting variables 28983 1726883085.69597: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:45 -0400 (0:00:00.044) 0:01:55.694 ****** 28983 1726883085.69624: entering _queue_task() for managed_node2/service 28983 1726883085.69821: worker is 1 (out of 1 available) 28983 1726883085.69836: exiting _queue_task() for managed_node2/service 28983 1726883085.69849: done queuing things up, now waiting for results queue to drain 28983 1726883085.69851: waiting for pending results... 
28983 1726883085.70050: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883085.70167: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b46 28983 1726883085.70184: variable 'ansible_search_path' from source: unknown 28983 1726883085.70188: variable 'ansible_search_path' from source: unknown 28983 1726883085.70221: calling self._execute() 28983 1726883085.70305: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.70314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.70324: variable 'omit' from source: magic vars 28983 1726883085.70631: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.70646: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.70746: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.70917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883085.72669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883085.72736: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883085.72767: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883085.72804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.72829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.72897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883085.72920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.72946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.72981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.72995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.73037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.73060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.73084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.73115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.73127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.73166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.73188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.73209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.73242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.73257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.73399: variable 'network_connections' from source: include params 28983 1726883085.73410: variable 'interface' from source: play vars 28983 1726883085.73464: variable 'interface' from source: play vars 28983 1726883085.73525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883085.73657: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883085.73701: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883085.73728: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883085.73755: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883085.73794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883085.73814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883085.73836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.73858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883085.73903: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883085.74108: variable 'network_connections' from source: include params 28983 1726883085.74112: variable 'interface' from source: play vars 28983 1726883085.74165: variable 'interface' from source: play vars 28983 1726883085.74187: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883085.74191: when evaluation is False, skipping this task 28983 1726883085.74194: _execute() done 28983 1726883085.74197: dumping result to json 28983 1726883085.74202: done dumping result, returning 28983 1726883085.74209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001b46] 28983 1726883085.74214: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b46 28983 1726883085.74312: done sending task result for task 
0affe814-3a2d-b16d-c0a7-000000001b46 28983 1726883085.74322: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883085.74379: no more pending results, returning what we have 28983 1726883085.74382: results queue empty 28983 1726883085.74383: checking for any_errors_fatal 28983 1726883085.74390: done checking for any_errors_fatal 28983 1726883085.74391: checking for max_fail_percentage 28983 1726883085.74393: done checking for max_fail_percentage 28983 1726883085.74394: checking to see if all hosts have failed and the running result is not ok 28983 1726883085.74395: done checking to see if all hosts have failed 28983 1726883085.74396: getting the remaining hosts for this loop 28983 1726883085.74398: done getting the remaining hosts for this loop 28983 1726883085.74402: getting the next task for host managed_node2 28983 1726883085.74410: done getting next task for host managed_node2 28983 1726883085.74414: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883085.74420: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
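
This second skip follows the same pattern with a `service` action: the log loads the filter and test plugins needed to inspect `network_connections`, finds neither wireless nor team connection profiles defined, and reports `false_condition: "__network_wireless_connections_defined or __network_team_connections_defined"`. A hedged sketch of what such a task plausibly looks like (task name, module, and condition come from the log; the `restarted` state is an assumption implied by the task name):

```yaml
# Hypothetical sketch of the skipped task (actual definition:
# roles/network/tasks/main.yml:109 in fedora.linux_system_roles.network)
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted             # assumed from "Restart" in the task name
  when: __network_wireless_connections_defined or
        __network_team_connections_defined

```
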
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883085.74445: getting variables 28983 1726883085.74447: in VariableManager get_vars() 28983 1726883085.74485: Calling all_inventory to load vars for managed_node2 28983 1726883085.74488: Calling groups_inventory to load vars for managed_node2 28983 1726883085.74491: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883085.74499: Calling all_plugins_play to load vars for managed_node2 28983 1726883085.74502: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883085.74506: Calling groups_plugins_play to load vars for managed_node2 28983 1726883085.75703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883085.77281: done with get_vars() 28983 1726883085.77303: done getting variables 28983 1726883085.77351: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:45 -0400 (0:00:00.077) 0:01:55.771 ****** 28983 1726883085.77376: entering _queue_task() for managed_node2/service 28983 1726883085.77576: worker is 1 (out of 1 available) 28983 1726883085.77591: exiting _queue_task() for managed_node2/service 28983 1726883085.77605: done 
queuing things up, now waiting for results queue to drain 28983 1726883085.77607: waiting for pending results... 28983 1726883085.77790: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883085.77906: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b47 28983 1726883085.77919: variable 'ansible_search_path' from source: unknown 28983 1726883085.77922: variable 'ansible_search_path' from source: unknown 28983 1726883085.77957: calling self._execute() 28983 1726883085.78039: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.78047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.78058: variable 'omit' from source: magic vars 28983 1726883085.78366: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.78384: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883085.78518: variable 'network_provider' from source: set_fact 28983 1726883085.78522: variable 'network_state' from source: role '' defaults 28983 1726883085.78536: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883085.78542: variable 'omit' from source: magic vars 28983 1726883085.78596: variable 'omit' from source: magic vars 28983 1726883085.78619: variable 'network_service_name' from source: role '' defaults 28983 1726883085.78675: variable 'network_service_name' from source: role '' defaults 28983 1726883085.78765: variable '__network_provider_setup' from source: role '' defaults 28983 1726883085.78770: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883085.78825: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883085.78833: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883085.78888: variable '__network_packages_default_nm' from source: role '' 
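
Unlike the two preceding tasks, this one is not skipped: the log shows `Evaluated conditional (network_provider == "nm" or network_state != {}): True`, then resolves `network_service_name` from role defaults before handing off to the `service` action. A sketch of the likely task shape, assuming the usual enable-and-start semantics suggested by the task name (the `started`/`enabled` arguments are assumptions; the service name variable and condition are taken from the log):

```yaml
# Hypothetical sketch (actual definition:
# roles/network/tasks/main.yml:122 in fedora.linux_system_roles.network)
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"  # resolved from role defaults in the log
    state: started                      # assumed from "start" in the task name
    enabled: true                       # assumed from "Enable" in the task name
  when: network_provider == "nm" or network_state != {}

```
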
defaults 28983 1726883085.79082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883085.81057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883085.81110: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883085.81153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883085.81187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883085.81210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883085.81284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.81308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.81329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.81369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.81384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.81423: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.81446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.81469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.81501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.81514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.81702: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883085.81798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.81817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.81838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.81869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.81885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.81960: variable 'ansible_python' from source: facts 28983 1726883085.81976: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883085.82044: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883085.82110: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883085.82218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.82240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.82261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.82293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.82305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.82349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883085.82375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883085.82394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.82426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883085.82443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883085.82555: variable 'network_connections' from source: include params 28983 1726883085.82563: variable 'interface' from source: play vars 28983 1726883085.82624: variable 'interface' from source: play vars 28983 1726883085.82711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883085.82849: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883085.82902: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883085.82938: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883085.82976: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883085.83024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883085.83050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883085.83079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883085.83108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883085.83149: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.83407: variable 'network_connections' from source: include params 28983 1726883085.83410: variable 'interface' from source: play vars 28983 1726883085.83448: variable 'interface' from source: play vars 28983 1726883085.83477: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883085.83545: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883085.83781: variable 'network_connections' from source: include params 28983 1726883085.83784: variable 'interface' from source: play vars 28983 1726883085.83843: variable 'interface' from source: play vars 28983 1726883085.83864: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883085.83926: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883085.84166: variable 'network_connections' from source: include params 28983 1726883085.84169: variable 'interface' from source: play vars 28983 1726883085.84229: variable 'interface' from source: play vars 28983 1726883085.84275: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883085.84324: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883085.84331: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883085.84382: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883085.84565: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883085.84959: variable 'network_connections' from source: include params 28983 1726883085.84962: variable 'interface' from source: play vars 28983 1726883085.85013: variable 'interface' from source: play vars 28983 1726883085.85020: variable 'ansible_distribution' from source: facts 28983 1726883085.85024: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.85031: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.85047: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883085.85189: variable 'ansible_distribution' from source: facts 28983 1726883085.85193: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.85199: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.85206: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883085.85350: variable 'ansible_distribution' from source: facts 28983 1726883085.85353: variable '__network_rh_distros' from source: role '' defaults 28983 1726883085.85359: variable 'ansible_distribution_major_version' from source: facts 28983 1726883085.85392: variable 'network_provider' from source: set_fact 28983 1726883085.85411: variable 'omit' from source: magic vars 28983 1726883085.85437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883085.85461: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883085.85478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883085.85496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883085.85505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883085.85531: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883085.85536: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.85541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883085.85618: Set connection var ansible_connection to ssh 28983 1726883085.85629: Set connection var ansible_shell_executable to /bin/sh 28983 1726883085.85639: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883085.85648: Set connection var ansible_timeout to 10 28983 1726883085.85654: Set connection var ansible_pipelining to False 28983 1726883085.85656: Set connection var ansible_shell_type to sh 28983 1726883085.85677: variable 'ansible_shell_executable' from source: unknown 28983 1726883085.85681: variable 'ansible_connection' from source: unknown 28983 1726883085.85684: variable 'ansible_module_compression' from source: unknown 28983 1726883085.85688: variable 'ansible_shell_type' from source: unknown 28983 1726883085.85691: variable 'ansible_shell_executable' from source: unknown 28983 1726883085.85696: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883085.85700: variable 'ansible_pipelining' from source: unknown 28983 1726883085.85704: variable 'ansible_timeout' from source: unknown 28983 1726883085.85712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883085.85793: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883085.85803: variable 'omit' from source: magic vars 28983 1726883085.85810: starting attempt loop 28983 1726883085.85814: running the handler 28983 1726883085.85881: variable 'ansible_facts' from source: unknown 28983 1726883085.86484: _low_level_execute_command(): starting 28983 1726883085.86491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883085.87016: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883085.87022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883085.87025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883085.87027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883085.87090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883085.87098: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726883085.87101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883085.87175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883085.88964: stdout chunk (state=3): >>>/root <<< 28983 1726883085.89177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883085.89181: stdout chunk (state=3): >>><<< 28983 1726883085.89183: stderr chunk (state=3): >>><<< 28983 1726883085.89202: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883085.89223: _low_level_execute_command(): starting 28983 1726883085.89240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637 `" && echo ansible-tmp-1726883085.8920982-33178-266340757145637="` echo /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637 `" ) && sleep 0' 28983 1726883085.89892: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883085.89908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883085.89935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883085.90047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883085.90074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883085.90187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883085.92244: stdout chunk (state=3): >>>ansible-tmp-1726883085.8920982-33178-266340757145637=/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637 <<< 28983 1726883085.92540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883085.92544: stdout chunk (state=3): >>><<< 28983 
1726883085.92548: stderr chunk (state=3): >>><<< 28983 1726883085.92552: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883085.8920982-33178-266340757145637=/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883085.92554: variable 'ansible_module_compression' from source: unknown 28983 1726883085.92574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883085.92639: variable 'ansible_facts' from source: unknown 28983 1726883085.92864: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py 28983 1726883085.93136: Sending initial data 28983 1726883085.93148: Sent initial data (156 bytes) 28983 1726883085.93751: stderr chunk (state=3): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883085.93836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883085.93861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883085.93907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883085.93976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883085.95706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 
1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883085.95787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883085.95872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8ze5nwvg /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py <<< 28983 1726883085.95876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py" <<< 28983 1726883085.95953: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8ze5nwvg" to remote "/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py" <<< 28983 1726883085.98662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883085.98691: stderr chunk (state=3): >>><<< 28983 1726883085.98694: stdout chunk (state=3): >>><<< 28983 1726883085.98812: done transferring module to remote 28983 1726883085.98816: _low_level_execute_command(): starting 28983 1726883085.98818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/ /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py && sleep 0' 28983 1726883085.99344: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883085.99361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883085.99386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883085.99408: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883085.99493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883085.99536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883085.99552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883085.99574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883085.99676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883086.01643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883086.01654: stdout chunk (state=3): >>><<< 28983 1726883086.01673: stderr chunk (state=3): >>><<< 28983 1726883086.01775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883086.01779: _low_level_execute_command(): starting 28983 1726883086.01782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/AnsiballZ_systemd.py && sleep 0' 28983 1726883086.02456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883086.02478: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883086.02504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883086.02599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883086.35532: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4534272", "MemoryAvailable": "infinity", "CPUUsageNSec": "1656061000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726883086.35581: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", 
"ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": 
"dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 
21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883086.37552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883086.37660: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883086.37687: stdout chunk (state=3): >>><<< 28983 1726883086.37691: stderr chunk (state=3): >>><<< 28983 1726883086.37712: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 
21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4534272", "MemoryAvailable": "infinity", "CPUUsageNSec": "1656061000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": 
"yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", 
"StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", 
"ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", 
"CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883086.38144: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883086.38148: _low_level_execute_command(): starting 28983 1726883086.38151: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883085.8920982-33178-266340757145637/ > /dev/null 2>&1 && sleep 0' 28983 1726883086.38742: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883086.38794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883086.38810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883086.38850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883086.38922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883086.38973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883086.39052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883086.41088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883086.41092: stdout chunk (state=3): >>><<< 28983 1726883086.41096: stderr chunk (state=3): >>><<< 28983 1726883086.41114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883086.41128: handler run complete 28983 1726883086.41304: attempt loop complete, returning result 28983 1726883086.41307: _execute() done 28983 1726883086.41310: dumping result to json 28983 1726883086.41312: done dumping result, returning 28983 1726883086.41316: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000001b47] 28983 1726883086.41318: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b47 28983 1726883086.41910: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b47 28983 1726883086.41914: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883086.41987: no more pending results, returning what we have 28983 1726883086.41990: results queue empty 28983 1726883086.41992: checking for any_errors_fatal 28983 1726883086.41999: done checking for any_errors_fatal 28983 1726883086.42000: checking for max_fail_percentage 28983 1726883086.42002: done checking for max_fail_percentage 28983 1726883086.42003: checking to see if all hosts have failed and the running result is not ok 28983 1726883086.42004: done checking to see if all hosts have failed 28983 1726883086.42005: getting the remaining hosts for this loop 28983 1726883086.42008: done getting the remaining hosts for this loop 28983 1726883086.42014: getting the next task for host managed_node2 28983 1726883086.42022: done getting next task for host managed_node2 28983 1726883086.42027: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883086.42035: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883086.42051: getting variables 28983 1726883086.42053: in VariableManager get_vars() 28983 1726883086.42098: Calling all_inventory to load vars for managed_node2 28983 1726883086.42102: Calling groups_inventory to load vars for managed_node2 28983 1726883086.42106: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883086.42116: Calling all_plugins_play to load vars for managed_node2 28983 1726883086.42120: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883086.42125: Calling groups_plugins_play to load vars for managed_node2 28983 1726883086.45861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883086.49898: done with get_vars() 28983 1726883086.49948: done getting variables 28983 1726883086.50018: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:46 -0400 (0:00:00.726) 0:01:56.498 ****** 28983 1726883086.50076: entering _queue_task() for managed_node2/service 28983 1726883086.50467: worker is 1 (out of 1 available) 28983 1726883086.50482: exiting _queue_task() for managed_node2/service 28983 1726883086.50497: done queuing things up, now waiting for results queue to drain 28983 1726883086.50499: waiting for pending results... 
28983 1726883086.50858: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883086.51092: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b48 28983 1726883086.51097: variable 'ansible_search_path' from source: unknown 28983 1726883086.51123: variable 'ansible_search_path' from source: unknown 28983 1726883086.51232: calling self._execute() 28983 1726883086.51298: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883086.51313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883086.51338: variable 'omit' from source: magic vars 28983 1726883086.51823: variable 'ansible_distribution_major_version' from source: facts 28983 1726883086.51845: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883086.52019: variable 'network_provider' from source: set_fact 28983 1726883086.52106: Evaluated conditional (network_provider == "nm"): True 28983 1726883086.52165: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883086.52290: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883086.52540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883086.55644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883086.55648: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883086.55651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883086.55782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883086.55819: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883086.56020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883086.56119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883086.56225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883086.56287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883086.56428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883086.56495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883086.56659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883086.56694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883086.56759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883086.56864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883086.56921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883086.57172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883086.57175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883086.57248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883086.57274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883086.57638: variable 'network_connections' from source: include params 28983 1726883086.57829: variable 'interface' from source: play vars 28983 1726883086.57833: variable 'interface' from source: play vars 28983 1726883086.58014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883086.58443: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883086.58541: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883086.58637: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883086.58670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883086.58864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883086.58894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883086.58923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883086.59076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883086.59229: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883086.59759: variable 'network_connections' from source: include params 28983 1726883086.59769: variable 'interface' from source: play vars 28983 1726883086.59846: variable 'interface' from source: play vars 28983 1726883086.59886: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883086.59889: when evaluation is False, skipping this task 28983 1726883086.59892: _execute() done 28983 1726883086.59897: dumping result to json 28983 1726883086.59902: done dumping result, returning 28983 1726883086.59912: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000001b48] 28983 
1726883086.59923: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b48 28983 1726883086.60239: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b48 28983 1726883086.60243: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883086.60288: no more pending results, returning what we have 28983 1726883086.60292: results queue empty 28983 1726883086.60293: checking for any_errors_fatal 28983 1726883086.60321: done checking for any_errors_fatal 28983 1726883086.60322: checking for max_fail_percentage 28983 1726883086.60324: done checking for max_fail_percentage 28983 1726883086.60325: checking to see if all hosts have failed and the running result is not ok 28983 1726883086.60326: done checking to see if all hosts have failed 28983 1726883086.60327: getting the remaining hosts for this loop 28983 1726883086.60328: done getting the remaining hosts for this loop 28983 1726883086.60332: getting the next task for host managed_node2 28983 1726883086.60341: done getting next task for host managed_node2 28983 1726883086.60346: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883086.60352: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883086.60380: getting variables 28983 1726883086.60382: in VariableManager get_vars() 28983 1726883086.60424: Calling all_inventory to load vars for managed_node2 28983 1726883086.60428: Calling groups_inventory to load vars for managed_node2 28983 1726883086.60431: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883086.60442: Calling all_plugins_play to load vars for managed_node2 28983 1726883086.60446: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883086.60450: Calling groups_plugins_play to load vars for managed_node2 28983 1726883086.62831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883086.65284: done with get_vars() 28983 1726883086.65309: done getting variables 28983 1726883086.65361: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:46 -0400 (0:00:00.153) 0:01:56.651 
****** 28983 1726883086.65390: entering _queue_task() for managed_node2/service 28983 1726883086.65657: worker is 1 (out of 1 available) 28983 1726883086.65675: exiting _queue_task() for managed_node2/service 28983 1726883086.65690: done queuing things up, now waiting for results queue to drain 28983 1726883086.65692: waiting for pending results... 28983 1726883086.65892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883086.66008: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b49 28983 1726883086.66023: variable 'ansible_search_path' from source: unknown 28983 1726883086.66029: variable 'ansible_search_path' from source: unknown 28983 1726883086.66064: calling self._execute() 28983 1726883086.66150: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883086.66155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883086.66170: variable 'omit' from source: magic vars 28983 1726883086.66617: variable 'ansible_distribution_major_version' from source: facts 28983 1726883086.66621: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883086.66763: variable 'network_provider' from source: set_fact 28983 1726883086.66941: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883086.66944: when evaluation is False, skipping this task 28983 1726883086.66946: _execute() done 28983 1726883086.66949: dumping result to json 28983 1726883086.66950: done dumping result, returning 28983 1726883086.66953: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000001b49] 28983 1726883086.66955: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b49 28983 1726883086.67025: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b49 28983 1726883086.67028: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883086.67088: no more pending results, returning what we have 28983 1726883086.67091: results queue empty 28983 1726883086.67092: checking for any_errors_fatal 28983 1726883086.67099: done checking for any_errors_fatal 28983 1726883086.67100: checking for max_fail_percentage 28983 1726883086.67102: done checking for max_fail_percentage 28983 1726883086.67103: checking to see if all hosts have failed and the running result is not ok 28983 1726883086.67104: done checking to see if all hosts have failed 28983 1726883086.67105: getting the remaining hosts for this loop 28983 1726883086.67106: done getting the remaining hosts for this loop 28983 1726883086.67110: getting the next task for host managed_node2 28983 1726883086.67117: done getting next task for host managed_node2 28983 1726883086.67121: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883086.67127: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883086.67159: getting variables 28983 1726883086.67161: in VariableManager get_vars() 28983 1726883086.67204: Calling all_inventory to load vars for managed_node2 28983 1726883086.67207: Calling groups_inventory to load vars for managed_node2 28983 1726883086.67210: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883086.67219: Calling all_plugins_play to load vars for managed_node2 28983 1726883086.67223: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883086.67227: Calling groups_plugins_play to load vars for managed_node2 28983 1726883086.68773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883086.70879: done with get_vars() 28983 1726883086.70915: done getting variables 28983 1726883086.70989: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:46 -0400 (0:00:00.056) 0:01:56.708 ****** 28983 1726883086.71029: entering _queue_task() for managed_node2/copy 28983 1726883086.71570: worker is 1 (out of 1 available) 28983 1726883086.71584: exiting _queue_task() for managed_node2/copy 28983 1726883086.71596: done queuing things up, now waiting for results queue to drain 28983 1726883086.71598: waiting for 
pending results... 28983 1726883086.71838: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883086.71920: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b4a 28983 1726883086.71952: variable 'ansible_search_path' from source: unknown 28983 1726883086.71962: variable 'ansible_search_path' from source: unknown 28983 1726883086.72010: calling self._execute() 28983 1726883086.72133: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883086.72154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883086.72174: variable 'omit' from source: magic vars 28983 1726883086.72697: variable 'ansible_distribution_major_version' from source: facts 28983 1726883086.72701: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883086.72823: variable 'network_provider' from source: set_fact 28983 1726883086.72848: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883086.72857: when evaluation is False, skipping this task 28983 1726883086.72864: _execute() done 28983 1726883086.72874: dumping result to json 28983 1726883086.72885: done dumping result, returning 28983 1726883086.72899: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000001b4a] 28983 1726883086.72915: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883086.73212: no more pending results, returning what we have 28983 1726883086.73220: results queue empty 28983 1726883086.73221: checking for any_errors_fatal 28983 1726883086.73228: done checking for any_errors_fatal 28983 1726883086.73232: checking for 
max_fail_percentage 28983 1726883086.73235: done checking for max_fail_percentage 28983 1726883086.73237: checking to see if all hosts have failed and the running result is not ok 28983 1726883086.73238: done checking to see if all hosts have failed 28983 1726883086.73239: getting the remaining hosts for this loop 28983 1726883086.73241: done getting the remaining hosts for this loop 28983 1726883086.73246: getting the next task for host managed_node2 28983 1726883086.73255: done getting next task for host managed_node2 28983 1726883086.73261: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883086.73269: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883086.73306: getting variables 28983 1726883086.73308: in VariableManager get_vars() 28983 1726883086.73524: Calling all_inventory to load vars for managed_node2 28983 1726883086.73528: Calling groups_inventory to load vars for managed_node2 28983 1726883086.73531: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883086.73541: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4a 28983 1726883086.73544: WORKER PROCESS EXITING 28983 1726883086.73553: Calling all_plugins_play to load vars for managed_node2 28983 1726883086.73557: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883086.73561: Calling groups_plugins_play to load vars for managed_node2 28983 1726883086.76204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883086.80226: done with get_vars() 28983 1726883086.80547: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:46 -0400 (0:00:00.096) 0:01:56.804 ****** 28983 1726883086.80656: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883086.81193: worker is 1 (out of 1 available) 28983 1726883086.81210: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883086.81225: done queuing things up, now waiting for results queue to drain 28983 1726883086.81227: waiting for pending results... 
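The skip result printed above follows from the two `when` evaluations: `ansible_distribution_major_version != '6'` passed, `network_provider == "initscripts"` did not, so the executor short-circuited before dispatching the `copy` action. A minimal sketch of that control flow (hypothetical helper names, with plain lambdas standing in for Ansible's Jinja2 conditional evaluation — not the real `TaskExecutor` code):

```python
def evaluate_conditionals(conditionals, variables):
    """Return (passed, failing_expression). Each conditional is a
    (source_text, predicate) pair; the predicate is a stand-in for
    Jinja2 evaluation of the `when` expression."""
    for expr, predicate in conditionals:
        if not predicate(variables):
            return False, expr
    return True, None

def run_task(conditionals, variables, execute):
    passed, failed_expr = evaluate_conditionals(conditionals, variables)
    if not passed:
        # Mirrors the log: "when evaluation is False, skipping this task"
        return {
            "changed": False,
            "skipped": True,
            "false_condition": failed_expr,
            "skip_reason": "Conditional result was False",
        }
    return execute(variables)

variables = {
    "ansible_distribution_major_version": "40",  # assumed Fedora-like value
    "network_provider": "nm",                    # from set_fact in the log
}
conditionals = [
    ("ansible_distribution_major_version != '6'",
     lambda v: v["ansible_distribution_major_version"] != "6"),
    ('network_provider == "initscripts"',
     lambda v: v["network_provider"] == "initscripts"),
]
result = run_task(conditionals, variables, lambda v: {"changed": True})
```

The second conditional fails, so `result` carries the same `false_condition` and `skip_reason` fields seen in the `skipping: [managed_node2]` output above.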
28983 1726883086.81829: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883086.82150: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b4b 28983 1726883086.82172: variable 'ansible_search_path' from source: unknown 28983 1726883086.82176: variable 'ansible_search_path' from source: unknown 28983 1726883086.82274: calling self._execute() 28983 1726883086.82544: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883086.82552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883086.82567: variable 'omit' from source: magic vars 28983 1726883086.83624: variable 'ansible_distribution_major_version' from source: facts 28983 1726883086.83715: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883086.83719: variable 'omit' from source: magic vars 28983 1726883086.83860: variable 'omit' from source: magic vars 28983 1726883086.84200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883086.90010: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883086.90104: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883086.90286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883086.90330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883086.90663: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883086.90701: variable 'network_provider' from source: set_fact 28983 1726883086.91055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883086.91169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883086.91269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883086.91379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883086.91397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883086.91705: variable 'omit' from source: magic vars 28983 1726883086.91970: variable 'omit' from source: magic vars 28983 1726883086.92227: variable 'network_connections' from source: include params 28983 1726883086.92291: variable 'interface' from source: play vars 28983 1726883086.92482: variable 'interface' from source: play vars 28983 1726883086.92951: variable 'omit' from source: magic vars 28983 1726883086.92955: variable '__lsr_ansible_managed' from source: task vars 28983 1726883086.93060: variable '__lsr_ansible_managed' from source: task vars 28983 1726883086.93583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883086.94205: Loaded config def from plugin (lookup/template) 28983 1726883086.94210: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883086.94368: File lookup term: get_ansible_managed.j2 28983 1726883086.94372: variable 
'ansible_search_path' from source: unknown 28983 1726883086.94375: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883086.94380: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883086.94383: variable 'ansible_search_path' from source: unknown 28983 1726883087.19042: variable 'ansible_managed' from source: unknown 28983 1726883087.19375: variable 'omit' from source: magic vars 28983 1726883087.19639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883087.19643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883087.19646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883087.19648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883087.19650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883087.19764: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883087.19768: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883087.19779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883087.20017: Set connection var ansible_connection to ssh 28983 1726883087.20032: Set connection var ansible_shell_executable to /bin/sh 28983 1726883087.20114: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883087.20125: Set connection var ansible_timeout to 10 28983 1726883087.20133: Set connection var ansible_pipelining to False 28983 1726883087.20137: Set connection var ansible_shell_type to sh 28983 1726883087.20217: variable 'ansible_shell_executable' from source: unknown 28983 1726883087.20220: variable 'ansible_connection' from source: unknown 28983 1726883087.20225: variable 'ansible_module_compression' from source: unknown 28983 1726883087.20229: variable 'ansible_shell_type' from source: unknown 28983 1726883087.20235: variable 'ansible_shell_executable' from source: unknown 28983 1726883087.20239: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883087.20245: variable 'ansible_pipelining' from source: unknown 28983 1726883087.20249: variable 'ansible_timeout' from source: unknown 28983 1726883087.20255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883087.20839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883087.20851: variable 'omit' from 
source: magic vars 28983 1726883087.20853: starting attempt loop 28983 1726883087.20860: running the handler 28983 1726883087.20862: _low_level_execute_command(): starting 28983 1726883087.20864: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883087.22226: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883087.22296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883087.22308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883087.22325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883087.22551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.22574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883087.22591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883087.22839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883087.22843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883087.24652: stdout chunk (state=3): >>>/root <<< 28983 1726883087.24750: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 28983 1726883087.24884: stderr chunk (state=3): >>><<< 28983 1726883087.24892: stdout chunk (state=3): >>><<< 28983 1726883087.24927: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883087.24941: _low_level_execute_command(): starting 28983 1726883087.24948: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184 `" && echo ansible-tmp-1726883087.2492752-33213-215870521070184="` echo /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184 `" ) && sleep 0' 28983 1726883087.26123: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883087.26151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
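The `umask 77 && mkdir -p ... && mkdir ... && echo name=path` one-liner above is how the controller provisions a private remote working directory and learns its resolved path from stdout. A local sketch of the same idiom (demo paths only, under `$TMPDIR`, not the real `/root/.ansible/tmp` layout):

```shell
# umask 77 makes the new directories mode 0700; mkdir -p creates the parent;
# the second, plain mkdir fails if the uniquely-named dir already exists
# (a cheap collision check); the echo reports name=path back on stdout.
base="${TMPDIR:-/tmp}/ansible-demo-$$"
name="ansible-tmp-$(date +%s)-$$-12345"
out=$( (umask 77 && mkdir -p "$base" && mkdir "$base/$name" && echo "$name=$base/$name") )
echo "$out"
test -d "$base/$name"               # the AnsiballZ payload would be uploaded here
rm -f -r "$base" > /dev/null 2>&1   # mirrors the cleanup command later in the log
```

The trailing `&& sleep 0` in the logged command exists only to normalize the exit status of the pipeline; it is omitted here.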
28983 1726883087.26169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883087.26190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883087.26230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883087.26240: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883087.26252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.26392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.26467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883087.26518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883087.26521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883087.26844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883087.28696: stdout chunk (state=3): >>>ansible-tmp-1726883087.2492752-33213-215870521070184=/root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184 <<< 28983 1726883087.29000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883087.29003: stdout chunk (state=3): >>><<< 28983 1726883087.29012: stderr chunk (state=3): >>><<< 28983 1726883087.29032: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726883087.2492752-33213-215870521070184=/root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883087.29098: variable 'ansible_module_compression' from source: unknown 28983 1726883087.29148: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883087.29317: variable 'ansible_facts' from source: unknown 28983 1726883087.29575: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py 28983 1726883087.30166: Sending initial data 28983 1726883087.30170: Sent initial data (168 bytes) 28983 1726883087.31111: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 <<< 28983 1726883087.31115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883087.31121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.31124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883087.31126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.31293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883087.31305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883087.31328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883087.31549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883087.33142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883087.33205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883087.33676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmptjjqprqw /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py <<< 28983 1726883087.33681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py" <<< 28983 1726883087.33743: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmptjjqprqw" to remote "/root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py" <<< 28983 1726883087.36755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883087.37048: stderr chunk (state=3): >>><<< 28983 1726883087.37052: stdout chunk (state=3): >>><<< 28983 1726883087.37054: done transferring module to remote 28983 1726883087.37057: _low_level_execute_command(): starting 28983 1726883087.37060: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/ /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py && sleep 0' 28983 1726883087.37956: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config <<< 28983 1726883087.37970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883087.38240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.38247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883087.38353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883087.38379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883087.38602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883087.40588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883087.40591: stdout chunk (state=3): >>><<< 28983 1726883087.40600: stderr chunk (state=3): >>><<< 28983 1726883087.40617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883087.40622: _low_level_execute_command(): starting 28983 1726883087.40628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/AnsiballZ_network_connections.py && sleep 0' 28983 1726883087.41640: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883087.41949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.41961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883087.41976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883087.42013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883087.42269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883087.72065: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s3l94ua4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s3l94ua4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/2ca3cca4-edb7-40a1-9de5-195b63d4908d: error=unknown <<< 28983 1726883087.72283: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 
1726883087.74250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883087.74253: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883087.74483: stderr chunk (state=3): >>><<< 28983 1726883087.74487: stdout chunk (state=3): >>><<< 28983 1726883087.74490: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s3l94ua4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s3l94ua4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/2ca3cca4-edb7-40a1-9de5-195b63d4908d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883087.74511: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883087.74523: _low_level_execute_command(): starting 28983 1726883087.74591: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883087.2492752-33213-215870521070184/ > /dev/null 2>&1 && sleep 0' 28983 1726883087.76119: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883087.76228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883087.76391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883087.76454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883087.76493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883087.76919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883087.78658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883087.78814: stderr chunk (state=3): >>><<< 28983 1726883087.78828: stdout chunk (state=3): >>><<< 28983 1726883087.78846: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883087.78854: handler run complete 28983 1726883087.78893: attempt loop complete, returning result 28983 1726883087.78897: _execute() done 28983 1726883087.78900: dumping result to json 28983 1726883087.78905: done dumping result, returning 28983 1726883087.78917: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-000000001b4b] 28983 1726883087.78923: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4b 28983 1726883087.79369: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4b 28983 1726883087.79377: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 28983 
1726883087.79556: no more pending results, returning what we have 28983 1726883087.79561: results queue empty 28983 1726883087.79562: checking for any_errors_fatal 28983 1726883087.79570: done checking for any_errors_fatal 28983 1726883087.79573: checking for max_fail_percentage 28983 1726883087.79575: done checking for max_fail_percentage 28983 1726883087.79576: checking to see if all hosts have failed and the running result is not ok 28983 1726883087.79577: done checking to see if all hosts have failed 28983 1726883087.79578: getting the remaining hosts for this loop 28983 1726883087.79580: done getting the remaining hosts for this loop 28983 1726883087.79585: getting the next task for host managed_node2 28983 1726883087.79593: done getting next task for host managed_node2 28983 1726883087.79598: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883087.79603: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726883087.79618: getting variables 28983 1726883087.79620: in VariableManager get_vars() 28983 1726883087.80075: Calling all_inventory to load vars for managed_node2 28983 1726883087.80079: Calling groups_inventory to load vars for managed_node2 28983 1726883087.80083: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883087.80095: Calling all_plugins_play to load vars for managed_node2 28983 1726883087.80104: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883087.80109: Calling groups_plugins_play to load vars for managed_node2 28983 1726883087.84654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883087.88337: done with get_vars() 28983 1726883087.88380: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:47 -0400 (0:00:01.078) 0:01:57.882 ****** 28983 1726883087.88499: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883087.88910: worker is 1 (out of 1 available) 28983 1726883087.88924: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883087.88941: done queuing things up, now waiting for results queue to drain 28983 1726883087.88943: waiting for pending results... 
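The task result above records the exact module arguments the role passed to `fedora.linux_system_roles.network_connections`: provider `nm` and a single connection `statebr` with `persistent_state: absent`. A minimal playbook sketch that would produce an equivalent invocation; the hosts pattern and role install location are assumptions, while the variable values are taken directly from the logged `module_args`:

```yaml
# Sketch only: reproduces the module_args seen in the log above.
# "managed_node2" matches the inventory host in this run; everything
# else about the playbook layout is assumed.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr
            persistent_state: absent
```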
28983 1726883087.89474: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883087.89488: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b4c 28983 1726883087.89514: variable 'ansible_search_path' from source: unknown 28983 1726883087.89524: variable 'ansible_search_path' from source: unknown 28983 1726883087.89677: calling self._execute() 28983 1726883087.89726: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883087.89742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883087.89761: variable 'omit' from source: magic vars 28983 1726883087.90369: variable 'ansible_distribution_major_version' from source: facts 28983 1726883087.90399: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883087.90601: variable 'network_state' from source: role '' defaults 28983 1726883087.90620: Evaluated conditional (network_state != {}): False 28983 1726883087.90629: when evaluation is False, skipping this task 28983 1726883087.90640: _execute() done 28983 1726883087.90654: dumping result to json 28983 1726883087.90740: done dumping result, returning 28983 1726883087.90744: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000001b4c] 28983 1726883087.90746: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4c 28983 1726883087.91042: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4c 28983 1726883087.91046: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883087.91108: no more pending results, returning what we have 28983 1726883087.91113: results queue empty 28983 1726883087.91114: checking for any_errors_fatal 28983 1726883087.91124: done checking for any_errors_fatal 
28983 1726883087.91125: checking for max_fail_percentage 28983 1726883087.91127: done checking for max_fail_percentage 28983 1726883087.91128: checking to see if all hosts have failed and the running result is not ok 28983 1726883087.91129: done checking to see if all hosts have failed 28983 1726883087.91130: getting the remaining hosts for this loop 28983 1726883087.91132: done getting the remaining hosts for this loop 28983 1726883087.91139: getting the next task for host managed_node2 28983 1726883087.91148: done getting next task for host managed_node2 28983 1726883087.91153: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883087.91160: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883087.91192: getting variables 28983 1726883087.91194: in VariableManager get_vars() 28983 1726883087.91659: Calling all_inventory to load vars for managed_node2 28983 1726883087.91663: Calling groups_inventory to load vars for managed_node2 28983 1726883087.91666: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883087.91678: Calling all_plugins_play to load vars for managed_node2 28983 1726883087.91683: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883087.91687: Calling groups_plugins_play to load vars for managed_node2 28983 1726883087.96564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883088.02643: done with get_vars() 28983 1726883088.02697: done getting variables 28983 1726883088.02783: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:48 -0400 (0:00:00.143) 0:01:58.026 ****** 28983 1726883088.02828: entering _queue_task() for managed_node2/debug 28983 1726883088.03266: worker is 1 (out of 1 available) 28983 1726883088.03400: exiting _queue_task() for managed_node2/debug 28983 1726883088.03414: done queuing things up, now waiting for results queue to drain 28983 1726883088.03416: waiting for pending results... 
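The "Configure networking state" skip above is a conditional result: the log shows `Evaluated conditional (ansible_distribution_major_version != '6'): True` followed by `Evaluated conditional (network_state != {}): False`, so the task never runs because `network_state` still holds the role's empty default. A sketch of the kind of guarded task that produces this skip; the two conditions are copied from the log, but the module parameter shape is an assumption:

```yaml
# Sketch of the guard behind the skip logged above.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"   # hypothetical argument shape
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}          # evaluated False in this run
```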
28983 1726883088.03748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883088.04139: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b4d 28983 1726883088.04175: variable 'ansible_search_path' from source: unknown 28983 1726883088.04192: variable 'ansible_search_path' from source: unknown 28983 1726883088.04242: calling self._execute() 28983 1726883088.04394: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.04405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.04440: variable 'omit' from source: magic vars 28983 1726883088.04927: variable 'ansible_distribution_major_version' from source: facts 28983 1726883088.04957: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883088.05040: variable 'omit' from source: magic vars 28983 1726883088.05078: variable 'omit' from source: magic vars 28983 1726883088.05126: variable 'omit' from source: magic vars 28983 1726883088.05196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883088.05245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883088.05291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883088.05319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883088.05337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883088.05407: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883088.05486: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.05489: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883088.05605: Set connection var ansible_connection to ssh 28983 1726883088.05625: Set connection var ansible_shell_executable to /bin/sh 28983 1726883088.05644: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883088.05703: Set connection var ansible_timeout to 10 28983 1726883088.05706: Set connection var ansible_pipelining to False 28983 1726883088.05710: Set connection var ansible_shell_type to sh 28983 1726883088.05713: variable 'ansible_shell_executable' from source: unknown 28983 1726883088.05939: variable 'ansible_connection' from source: unknown 28983 1726883088.05943: variable 'ansible_module_compression' from source: unknown 28983 1726883088.05946: variable 'ansible_shell_type' from source: unknown 28983 1726883088.05948: variable 'ansible_shell_executable' from source: unknown 28983 1726883088.05951: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.05953: variable 'ansible_pipelining' from source: unknown 28983 1726883088.05955: variable 'ansible_timeout' from source: unknown 28983 1726883088.05957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.05960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883088.05976: variable 'omit' from source: magic vars 28983 1726883088.05988: starting attempt loop 28983 1726883088.05997: running the handler 28983 1726883088.06166: variable '__network_connections_result' from source: set_fact 28983 1726883088.06243: handler run complete 28983 1726883088.06276: attempt loop complete, returning result 28983 1726883088.06285: _execute() done 28983 1726883088.06304: dumping result to json 28983 1726883088.06315: 
done dumping result, returning 28983 1726883088.06329: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001b4d] 28983 1726883088.06411: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4d 28983 1726883088.06490: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4d 28983 1726883088.06494: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 28983 1726883088.06610: no more pending results, returning what we have 28983 1726883088.06614: results queue empty 28983 1726883088.06615: checking for any_errors_fatal 28983 1726883088.06633: done checking for any_errors_fatal 28983 1726883088.06637: checking for max_fail_percentage 28983 1726883088.06639: done checking for max_fail_percentage 28983 1726883088.06641: checking to see if all hosts have failed and the running result is not ok 28983 1726883088.06642: done checking to see if all hosts have failed 28983 1726883088.06643: getting the remaining hosts for this loop 28983 1726883088.06645: done getting the remaining hosts for this loop 28983 1726883088.06652: getting the next task for host managed_node2 28983 1726883088.06661: done getting next task for host managed_node2 28983 1726883088.06666: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883088.06676: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883088.06694: getting variables 28983 1726883088.06696: in VariableManager get_vars() 28983 1726883088.06863: Calling all_inventory to load vars for managed_node2 28983 1726883088.06867: Calling groups_inventory to load vars for managed_node2 28983 1726883088.06870: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883088.06884: Calling all_plugins_play to load vars for managed_node2 28983 1726883088.06888: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883088.06893: Calling groups_plugins_play to load vars for managed_node2 28983 1726883088.09797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883088.12980: done with get_vars() 28983 1726883088.13026: done getting variables 28983 1726883088.13104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:48 -0400 (0:00:00.103) 0:01:58.129 ****** 28983 1726883088.13159: entering _queue_task() for managed_node2/debug 28983 1726883088.13596: worker is 1 (out of 1 available) 28983 1726883088.13611: exiting _queue_task() for managed_node2/debug 28983 1726883088.13625: done queuing things up, now waiting for results queue to drain 28983 1726883088.13627: waiting for pending results... 28983 1726883088.14061: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883088.14190: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b4e 28983 1726883088.14214: variable 'ansible_search_path' from source: unknown 28983 1726883088.14223: variable 'ansible_search_path' from source: unknown 28983 1726883088.14389: calling self._execute() 28983 1726883088.14415: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.14429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.14501: variable 'omit' from source: magic vars 28983 1726883088.14982: variable 'ansible_distribution_major_version' from source: facts 28983 1726883088.15002: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883088.15016: variable 'omit' from source: magic vars 28983 1726883088.15116: variable 'omit' from source: magic vars 28983 1726883088.15185: variable 'omit' from source: magic vars 28983 1726883088.15248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883088.15314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883088.15382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883088.15396: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883088.15488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883088.15495: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883088.15499: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.15501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.15647: Set connection var ansible_connection to ssh 28983 1726883088.15669: Set connection var ansible_shell_executable to /bin/sh 28983 1726883088.15692: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883088.15724: Set connection var ansible_timeout to 10 28983 1726883088.15739: Set connection var ansible_pipelining to False 28983 1726883088.15749: Set connection var ansible_shell_type to sh 28983 1726883088.15784: variable 'ansible_shell_executable' from source: unknown 28983 1726883088.15812: variable 'ansible_connection' from source: unknown 28983 1726883088.15818: variable 'ansible_module_compression' from source: unknown 28983 1726883088.15824: variable 'ansible_shell_type' from source: unknown 28983 1726883088.15826: variable 'ansible_shell_executable' from source: unknown 28983 1726883088.15923: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.15926: variable 'ansible_pipelining' from source: unknown 28983 1726883088.15931: variable 'ansible_timeout' from source: unknown 28983 1726883088.15933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.16141: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883088.16148: variable 'omit' from source: magic vars 28983 1726883088.16151: starting attempt loop 28983 1726883088.16153: running the handler 28983 1726883088.16191: variable '__network_connections_result' from source: set_fact 28983 1726883088.16314: variable '__network_connections_result' from source: set_fact 28983 1726883088.16476: handler run complete 28983 1726883088.16524: attempt loop complete, returning result 28983 1726883088.16532: _execute() done 28983 1726883088.16542: dumping result to json 28983 1726883088.16552: done dumping result, returning 28983 1726883088.16564: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001b4e] 28983 1726883088.16583: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4e ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28983 1726883088.17068: no more pending results, returning what we have 28983 1726883088.17074: results queue empty 28983 1726883088.17075: checking for any_errors_fatal 28983 1726883088.17082: done checking for any_errors_fatal 28983 1726883088.17083: checking for max_fail_percentage 28983 1726883088.17084: done checking for max_fail_percentage 28983 1726883088.17086: checking to see if all hosts have failed and the running result is not ok 28983 1726883088.17087: done checking to see if all hosts have failed 28983 1726883088.17088: getting the 
remaining hosts for this loop 28983 1726883088.17090: done getting the remaining hosts for this loop 28983 1726883088.17094: getting the next task for host managed_node2 28983 1726883088.17104: done getting next task for host managed_node2 28983 1726883088.17108: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883088.17114: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883088.17130: getting variables 28983 1726883088.17132: in VariableManager get_vars() 28983 1726883088.17297: Calling all_inventory to load vars for managed_node2 28983 1726883088.17301: Calling groups_inventory to load vars for managed_node2 28983 1726883088.17304: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883088.17311: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4e 28983 1726883088.17314: WORKER PROCESS EXITING 28983 1726883088.17323: Calling all_plugins_play to load vars for managed_node2 28983 1726883088.17327: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883088.17331: Calling groups_plugins_play to load vars for managed_node2 28983 1726883088.27651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883088.31829: done with get_vars() 28983 1726883088.31874: done getting variables 28983 1726883088.31941: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:48 -0400 (0:00:00.188) 0:01:58.317 ****** 28983 1726883088.31985: entering _queue_task() for managed_node2/debug 28983 1726883088.32504: worker is 1 (out of 1 available) 28983 1726883088.32518: exiting _queue_task() for managed_node2/debug 28983 1726883088.32541: done queuing things up, now waiting for results queue to drain 28983 1726883088.32545: waiting for pending results... 
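The `ok: [managed_node2]` result above comes from a debug task that prints the registered `__network_connections_result` fact. In role terms it is roughly equivalent to the following; the task title and variable name are taken from the log, the fully qualified module spelling is an assumption:

```yaml
# Sketch of the debug task whose output appears above.
- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```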
28983 1726883088.33060: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883088.33539: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b4f 28983 1726883088.33566: variable 'ansible_search_path' from source: unknown 28983 1726883088.33636: variable 'ansible_search_path' from source: unknown 28983 1726883088.33742: calling self._execute() 28983 1726883088.33946: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.34001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.34064: variable 'omit' from source: magic vars 28983 1726883088.35114: variable 'ansible_distribution_major_version' from source: facts 28983 1726883088.35196: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883088.35570: variable 'network_state' from source: role '' defaults 28983 1726883088.35633: Evaluated conditional (network_state != {}): False 28983 1726883088.35837: when evaluation is False, skipping this task 28983 1726883088.35842: _execute() done 28983 1726883088.35845: dumping result to json 28983 1726883088.35848: done dumping result, returning 28983 1726883088.35852: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000001b4f] 28983 1726883088.35856: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4f 28983 1726883088.35938: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b4f 28983 1726883088.35942: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883088.36001: no more pending results, returning what we have 28983 1726883088.36005: results queue empty 28983 1726883088.36006: checking for any_errors_fatal 28983 1726883088.36020: done checking for any_errors_fatal 28983 1726883088.36021: checking for 
max_fail_percentage 28983 1726883088.36024: done checking for max_fail_percentage 28983 1726883088.36025: checking to see if all hosts have failed and the running result is not ok 28983 1726883088.36026: done checking to see if all hosts have failed 28983 1726883088.36027: getting the remaining hosts for this loop 28983 1726883088.36029: done getting the remaining hosts for this loop 28983 1726883088.36036: getting the next task for host managed_node2 28983 1726883088.36046: done getting next task for host managed_node2 28983 1726883088.36051: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883088.36057: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883088.36098: getting variables 28983 1726883088.36100: in VariableManager get_vars() 28983 1726883088.36453: Calling all_inventory to load vars for managed_node2 28983 1726883088.36457: Calling groups_inventory to load vars for managed_node2 28983 1726883088.36460: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883088.36547: Calling all_plugins_play to load vars for managed_node2 28983 1726883088.36552: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883088.36556: Calling groups_plugins_play to load vars for managed_node2 28983 1726883088.41010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883088.45930: done with get_vars() 28983 1726883088.46174: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:48 -0400 (0:00:00.143) 0:01:58.460 ****** 28983 1726883088.46294: entering _queue_task() for managed_node2/ping 28983 1726883088.46973: worker is 1 (out of 1 available) 28983 1726883088.46988: exiting _queue_task() for managed_node2/ping 28983 1726883088.47004: done queuing things up, now waiting for results queue to drain 28983 1726883088.47006: waiting for pending results... 
28983 1726883088.47994: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883088.48307: in run() - task 0affe814-3a2d-b16d-c0a7-000000001b50 28983 1726883088.48313: variable 'ansible_search_path' from source: unknown 28983 1726883088.48316: variable 'ansible_search_path' from source: unknown 28983 1726883088.48542: calling self._execute() 28983 1726883088.48662: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.48679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.48752: variable 'omit' from source: magic vars 28983 1726883088.49792: variable 'ansible_distribution_major_version' from source: facts 28983 1726883088.50054: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883088.50059: variable 'omit' from source: magic vars 28983 1726883088.50063: variable 'omit' from source: magic vars 28983 1726883088.50193: variable 'omit' from source: magic vars 28983 1726883088.50441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883088.50445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883088.50474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883088.50528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883088.50641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883088.50668: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883088.50722: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.50735: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883088.51039: Set connection var ansible_connection to ssh 28983 1726883088.51253: Set connection var ansible_shell_executable to /bin/sh 28983 1726883088.51258: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883088.51261: Set connection var ansible_timeout to 10 28983 1726883088.51263: Set connection var ansible_pipelining to False 28983 1726883088.51266: Set connection var ansible_shell_type to sh 28983 1726883088.51269: variable 'ansible_shell_executable' from source: unknown 28983 1726883088.51275: variable 'ansible_connection' from source: unknown 28983 1726883088.51278: variable 'ansible_module_compression' from source: unknown 28983 1726883088.51281: variable 'ansible_shell_type' from source: unknown 28983 1726883088.51400: variable 'ansible_shell_executable' from source: unknown 28983 1726883088.51404: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883088.51406: variable 'ansible_pipelining' from source: unknown 28983 1726883088.51408: variable 'ansible_timeout' from source: unknown 28983 1726883088.51411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883088.51880: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883088.51960: variable 'omit' from source: magic vars 28983 1726883088.51976: starting attempt loop 28983 1726883088.52041: running the handler 28983 1726883088.52242: _low_level_execute_command(): starting 28983 1726883088.52245: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883088.53242: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883088.53277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883088.53306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883088.53413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883088.55547: stdout chunk (state=3): >>>/root <<< 28983 1726883088.55551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883088.55562: stdout chunk (state=3): >>><<< 28983 1726883088.55565: stderr chunk (state=3): >>><<< 28983 1726883088.55592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883088.55604: _low_level_execute_command(): starting 28983 1726883088.55611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321 `" && echo ansible-tmp-1726883088.555894-33260-85122793192321="` echo /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321 `" ) && sleep 0' 28983 1726883088.56452: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883088.56481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883088.56533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883088.56543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883088.56631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883088.58681: stdout chunk (state=3): >>>ansible-tmp-1726883088.555894-33260-85122793192321=/root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321 <<< 28983 1726883088.58796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883088.58940: stderr chunk (state=3): >>><<< 28983 1726883088.58943: stdout chunk (state=3): >>><<< 28983 1726883088.58947: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883088.555894-33260-85122793192321=/root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883088.58949: variable 'ansible_module_compression' from source: unknown 28983 1726883088.59067: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883088.59072: variable 'ansible_facts' from source: unknown 28983 1726883088.59339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py 28983 1726883088.59383: Sending initial data 28983 1726883088.59387: Sent initial data (151 bytes) 28983 1726883088.60250: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883088.60259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883088.60298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883088.60368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 28983 1726883088.62031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883088.62192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883088.62218: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2hnh7ajp /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py <<< 28983 1726883088.62224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py" <<< 28983 1726883088.62775: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2hnh7ajp" to remote "/root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py" <<< 28983 1726883088.64345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883088.64349: stdout chunk (state=3): >>><<< 28983 1726883088.64358: stderr chunk (state=3): >>><<< 28983 1726883088.64387: done transferring module to remote 28983 
1726883088.64400: _low_level_execute_command(): starting 28983 1726883088.64407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/ /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py && sleep 0' 28983 1726883088.65231: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883088.65244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883088.65258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883088.65283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883088.65384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883088.65415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883088.65426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883088.65430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883088.65541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883088.67822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 28983 1726883088.67826: stderr chunk (state=3): >>><<< 28983 1726883088.67828: stdout chunk (state=3): >>><<< 28983 1726883088.67831: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883088.67840: _low_level_execute_command(): starting 28983 1726883088.67843: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/AnsiballZ_ping.py && sleep 0' 28983 1726883088.69070: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883088.69079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883088.69091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883088.69109: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883088.69122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883088.69137: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883088.69142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883088.69281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883088.69293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883088.69418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883088.86400: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883088.87792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883088.87841: stderr chunk (state=3): >>><<< 28983 1726883088.87845: stdout chunk (state=3): >>><<< 28983 1726883088.87863: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883088.87887: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883088.87898: _low_level_execute_command(): starting 28983 1726883088.87904: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883088.555894-33260-85122793192321/ > /dev/null 2>&1 && sleep 0' 28983 1726883088.88505: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883088.88546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883088.88549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883088.88552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883088.88554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883088.88616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883088.88620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883088.88691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883088.90630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883088.90701: stderr chunk (state=3): >>><<< 28983 1726883088.90705: stdout chunk (state=3): >>><<< 28983 1726883088.90753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883088.90757: handler run complete 28983 
1726883088.90782: attempt loop complete, returning result 28983 1726883088.90788: _execute() done 28983 1726883088.90792: dumping result to json 28983 1726883088.90794: done dumping result, returning 28983 1726883088.90804: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000001b50] 28983 1726883088.90811: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b50 ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883088.91028: no more pending results, returning what we have 28983 1726883088.91033: results queue empty 28983 1726883088.91036: checking for any_errors_fatal 28983 1726883088.91043: done checking for any_errors_fatal 28983 1726883088.91044: checking for max_fail_percentage 28983 1726883088.91046: done checking for max_fail_percentage 28983 1726883088.91048: checking to see if all hosts have failed and the running result is not ok 28983 1726883088.91049: done checking to see if all hosts have failed 28983 1726883088.91050: getting the remaining hosts for this loop 28983 1726883088.91052: done getting the remaining hosts for this loop 28983 1726883088.91058: getting the next task for host managed_node2 28983 1726883088.91070: done getting next task for host managed_node2 28983 1726883088.91075: ^ task is: TASK: meta (role_complete) 28983 1726883088.91081: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883088.91096: getting variables 28983 1726883088.91098: in VariableManager get_vars() 28983 1726883088.91159: Calling all_inventory to load vars for managed_node2 28983 1726883088.91164: Calling groups_inventory to load vars for managed_node2 28983 1726883088.91167: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883088.91183: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001b50 28983 1726883088.91187: WORKER PROCESS EXITING 28983 1726883088.91199: Calling all_plugins_play to load vars for managed_node2 28983 1726883088.91203: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883088.91207: Calling groups_plugins_play to load vars for managed_node2 28983 1726883088.93798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883088.95485: done with get_vars() 28983 1726883088.95510: done getting variables 28983 1726883088.95580: done queuing things up, now waiting for results queue to drain 28983 1726883088.95582: results queue empty 28983 1726883088.95582: checking for any_errors_fatal 28983 1726883088.95584: done checking for any_errors_fatal 28983 1726883088.95585: checking for max_fail_percentage 28983 1726883088.95586: done checking for max_fail_percentage 28983 1726883088.95586: checking to see if all hosts have failed and the running result 
is not ok 28983 1726883088.95587: done checking to see if all hosts have failed 28983 1726883088.95587: getting the remaining hosts for this loop 28983 1726883088.95588: done getting the remaining hosts for this loop 28983 1726883088.95590: getting the next task for host managed_node2 28983 1726883088.95595: done getting next task for host managed_node2 28983 1726883088.95597: ^ task is: TASK: Test 28983 1726883088.95599: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883088.95602: getting variables 28983 1726883088.95603: in VariableManager get_vars() 28983 1726883088.95612: Calling all_inventory to load vars for managed_node2 28983 1726883088.95613: Calling groups_inventory to load vars for managed_node2 28983 1726883088.95615: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883088.95619: Calling all_plugins_play to load vars for managed_node2 28983 1726883088.95622: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883088.95626: Calling groups_plugins_play to load vars for managed_node2 28983 1726883088.97261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883088.99401: done with get_vars() 28983 1726883088.99422: done getting variables TASK [Test] ******************************************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:44:48 -0400 (0:00:00.532) 0:01:58.992 ****** 28983 1726883088.99501: entering _queue_task() for managed_node2/include_tasks 28983 1726883088.99811: worker is 1 (out of 1 available) 28983 1726883088.99826: exiting _queue_task() for managed_node2/include_tasks 28983 1726883088.99843: done queuing things up, now waiting for results queue to drain 28983 1726883088.99845: waiting for pending results... 28983 1726883089.00094: running TaskExecutor() for managed_node2/TASK: Test 28983 1726883089.00244: in run() - task 0affe814-3a2d-b16d-c0a7-000000001748 28983 1726883089.00269: variable 'ansible_search_path' from source: unknown 28983 1726883089.00272: variable 'ansible_search_path' from source: unknown 28983 1726883089.00324: variable 'lsr_test' from source: include params 28983 1726883089.00529: variable 'lsr_test' from source: include params 28983 1726883089.00598: variable 'omit' from source: magic vars 28983 1726883089.00799: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.00844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.00867: variable 'omit' from source: magic vars 28983 1726883089.01274: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.01294: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.01308: variable 'item' from source: unknown 28983 1726883089.01454: variable 'item' from source: unknown 28983 1726883089.01657: variable 'item' from source: unknown 28983 1726883089.01909: variable 'item' from source: unknown 28983 1726883089.02492: dumping result to json 28983 1726883089.02495: done dumping result, returning 28983 1726883089.02497: done running TaskExecutor() for managed_node2/TASK: Test [0affe814-3a2d-b16d-c0a7-000000001748] 28983 1726883089.02499: sending task 
result for task 0affe814-3a2d-b16d-c0a7-000000001748 28983 1726883089.02558: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001748 28983 1726883089.02562: WORKER PROCESS EXITING 28983 1726883089.02640: no more pending results, returning what we have 28983 1726883089.02647: in VariableManager get_vars() 28983 1726883089.02900: Calling all_inventory to load vars for managed_node2 28983 1726883089.02904: Calling groups_inventory to load vars for managed_node2 28983 1726883089.02909: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.02923: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.02927: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.02931: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.05704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.07632: done with get_vars() 28983 1726883089.07654: variable 'ansible_search_path' from source: unknown 28983 1726883089.07655: variable 'ansible_search_path' from source: unknown 28983 1726883089.07689: we have included files to process 28983 1726883089.07690: generating all_blocks data 28983 1726883089.07693: done generating all_blocks data 28983 1726883089.07703: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883089.07704: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883089.07708: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883089.07873: done processing included file 28983 1726883089.07877: iterating over new_blocks loaded from include file 28983 1726883089.07878: in VariableManager get_vars() 28983 
1726883089.07901: done with get_vars() 28983 1726883089.07903: filtering new block on tags 28983 1726883089.07924: done filtering new block on tags 28983 1726883089.07926: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 28983 1726883089.07931: extending task lists for all hosts with included blocks 28983 1726883089.08838: done extending task lists 28983 1726883089.08840: done processing included files 28983 1726883089.08841: results queue empty 28983 1726883089.08842: checking for any_errors_fatal 28983 1726883089.08844: done checking for any_errors_fatal 28983 1726883089.08845: checking for max_fail_percentage 28983 1726883089.08846: done checking for max_fail_percentage 28983 1726883089.08846: checking to see if all hosts have failed and the running result is not ok 28983 1726883089.08847: done checking to see if all hosts have failed 28983 1726883089.08848: getting the remaining hosts for this loop 28983 1726883089.08849: done getting the remaining hosts for this loop 28983 1726883089.08851: getting the next task for host managed_node2 28983 1726883089.08854: done getting next task for host managed_node2 28983 1726883089.08856: ^ task is: TASK: Include network role 28983 1726883089.08858: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883089.08860: getting variables 28983 1726883089.08861: in VariableManager get_vars() 28983 1726883089.08870: Calling all_inventory to load vars for managed_node2 28983 1726883089.08872: Calling groups_inventory to load vars for managed_node2 28983 1726883089.08875: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.08879: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.08881: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.08883: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.10250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.12036: done with get_vars() 28983 1726883089.12058: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:44:49 -0400 (0:00:00.126) 0:01:59.119 ****** 28983 1726883089.12131: entering _queue_task() for managed_node2/include_role 28983 1726883089.12373: worker is 1 (out of 1 available) 28983 1726883089.12386: exiting _queue_task() for managed_node2/include_role 28983 1726883089.12398: done queuing things up, now waiting for results queue to drain 28983 1726883089.12400: waiting for pending results... 
28983 1726883089.12608: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883089.12709: in run() - task 0affe814-3a2d-b16d-c0a7-000000001ca9 28983 1726883089.12721: variable 'ansible_search_path' from source: unknown 28983 1726883089.12725: variable 'ansible_search_path' from source: unknown 28983 1726883089.12762: calling self._execute() 28983 1726883089.12850: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.12864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.12869: variable 'omit' from source: magic vars 28983 1726883089.13223: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.13232: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.13241: _execute() done 28983 1726883089.13244: dumping result to json 28983 1726883089.13249: done dumping result, returning 28983 1726883089.13256: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000001ca9] 28983 1726883089.13263: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001ca9 28983 1726883089.13380: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001ca9 28983 1726883089.13383: WORKER PROCESS EXITING 28983 1726883089.13419: no more pending results, returning what we have 28983 1726883089.13425: in VariableManager get_vars() 28983 1726883089.13473: Calling all_inventory to load vars for managed_node2 28983 1726883089.13476: Calling groups_inventory to load vars for managed_node2 28983 1726883089.13480: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.13491: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.13496: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.13500: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.14986: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.16864: done with get_vars() 28983 1726883089.16886: variable 'ansible_search_path' from source: unknown 28983 1726883089.16887: variable 'ansible_search_path' from source: unknown 28983 1726883089.17044: variable 'omit' from source: magic vars 28983 1726883089.17078: variable 'omit' from source: magic vars 28983 1726883089.17096: variable 'omit' from source: magic vars 28983 1726883089.17100: we have included files to process 28983 1726883089.17101: generating all_blocks data 28983 1726883089.17103: done generating all_blocks data 28983 1726883089.17104: processing included file: fedora.linux_system_roles.network 28983 1726883089.17121: in VariableManager get_vars() 28983 1726883089.17135: done with get_vars() 28983 1726883089.17159: in VariableManager get_vars() 28983 1726883089.17180: done with get_vars() 28983 1726883089.17217: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883089.17343: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883089.17437: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883089.17811: in VariableManager get_vars() 28983 1726883089.17828: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883089.19796: iterating over new_blocks loaded from include file 28983 1726883089.19799: in VariableManager get_vars() 28983 1726883089.19816: done with get_vars() 28983 1726883089.19818: filtering new block on tags 28983 1726883089.20098: done filtering new block on tags 28983 1726883089.20101: in VariableManager get_vars() 28983 1726883089.20113: done with get_vars() 28983 1726883089.20114: filtering new block on tags 28983 1726883089.20128: done 
filtering new block on tags 28983 1726883089.20130: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883089.20136: extending task lists for all hosts with included blocks 28983 1726883089.20220: done extending task lists 28983 1726883089.20221: done processing included files 28983 1726883089.20222: results queue empty 28983 1726883089.20222: checking for any_errors_fatal 28983 1726883089.20226: done checking for any_errors_fatal 28983 1726883089.20227: checking for max_fail_percentage 28983 1726883089.20228: done checking for max_fail_percentage 28983 1726883089.20228: checking to see if all hosts have failed and the running result is not ok 28983 1726883089.20229: done checking to see if all hosts have failed 28983 1726883089.20229: getting the remaining hosts for this loop 28983 1726883089.20230: done getting the remaining hosts for this loop 28983 1726883089.20233: getting the next task for host managed_node2 28983 1726883089.20239: done getting next task for host managed_node2 28983 1726883089.20242: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883089.20244: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883089.20253: getting variables 28983 1726883089.20254: in VariableManager get_vars() 28983 1726883089.20265: Calling all_inventory to load vars for managed_node2 28983 1726883089.20267: Calling groups_inventory to load vars for managed_node2 28983 1726883089.20268: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.20272: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.20274: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.20277: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.21596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.23244: done with get_vars() 28983 1726883089.23265: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:49 -0400 (0:00:00.111) 0:01:59.231 ****** 28983 1726883089.23322: entering _queue_task() for managed_node2/include_tasks 28983 1726883089.23565: worker is 1 (out of 1 available) 28983 1726883089.23580: exiting _queue_task() for managed_node2/include_tasks 28983 1726883089.23593: done queuing things up, now waiting for results queue to drain 28983 1726883089.23595: waiting for pending results... 
28983 1726883089.23796: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883089.23915: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d2b 28983 1726883089.23929: variable 'ansible_search_path' from source: unknown 28983 1726883089.23933: variable 'ansible_search_path' from source: unknown 28983 1726883089.23968: calling self._execute() 28983 1726883089.24054: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.24059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.24071: variable 'omit' from source: magic vars 28983 1726883089.24414: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.24424: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.24430: _execute() done 28983 1726883089.24435: dumping result to json 28983 1726883089.24441: done dumping result, returning 28983 1726883089.24448: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-000000001d2b] 28983 1726883089.24454: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2b 28983 1726883089.24555: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2b 28983 1726883089.24558: WORKER PROCESS EXITING 28983 1726883089.24612: no more pending results, returning what we have 28983 1726883089.24617: in VariableManager get_vars() 28983 1726883089.24665: Calling all_inventory to load vars for managed_node2 28983 1726883089.24668: Calling groups_inventory to load vars for managed_node2 28983 1726883089.24671: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.24680: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.24683: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.24687: Calling 
groups_plugins_play to load vars for managed_node2 28983 1726883089.26409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.29028: done with get_vars() 28983 1726883089.29051: variable 'ansible_search_path' from source: unknown 28983 1726883089.29053: variable 'ansible_search_path' from source: unknown 28983 1726883089.29083: we have included files to process 28983 1726883089.29084: generating all_blocks data 28983 1726883089.29086: done generating all_blocks data 28983 1726883089.29088: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883089.29089: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883089.29091: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883089.29623: done processing included file 28983 1726883089.29625: iterating over new_blocks loaded from include file 28983 1726883089.29626: in VariableManager get_vars() 28983 1726883089.29659: done with get_vars() 28983 1726883089.29661: filtering new block on tags 28983 1726883089.29703: done filtering new block on tags 28983 1726883089.29707: in VariableManager get_vars() 28983 1726883089.29740: done with get_vars() 28983 1726883089.29743: filtering new block on tags 28983 1726883089.29807: done filtering new block on tags 28983 1726883089.29811: in VariableManager get_vars() 28983 1726883089.29843: done with get_vars() 28983 1726883089.29845: filtering new block on tags 28983 1726883089.29905: done filtering new block on tags 28983 1726883089.29908: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883089.29914: extending task lists for 
all hosts with included blocks 28983 1726883089.32409: done extending task lists 28983 1726883089.32410: done processing included files 28983 1726883089.32411: results queue empty 28983 1726883089.32411: checking for any_errors_fatal 28983 1726883089.32413: done checking for any_errors_fatal 28983 1726883089.32414: checking for max_fail_percentage 28983 1726883089.32415: done checking for max_fail_percentage 28983 1726883089.32415: checking to see if all hosts have failed and the running result is not ok 28983 1726883089.32416: done checking to see if all hosts have failed 28983 1726883089.32416: getting the remaining hosts for this loop 28983 1726883089.32417: done getting the remaining hosts for this loop 28983 1726883089.32420: getting the next task for host managed_node2 28983 1726883089.32423: done getting next task for host managed_node2 28983 1726883089.32425: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883089.32429: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883089.32440: getting variables 28983 1726883089.32441: in VariableManager get_vars() 28983 1726883089.32454: Calling all_inventory to load vars for managed_node2 28983 1726883089.32455: Calling groups_inventory to load vars for managed_node2 28983 1726883089.32457: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.32461: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.32463: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.32465: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.34129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.37256: done with get_vars() 28983 1726883089.37291: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:44:49 -0400 (0:00:00.140) 0:01:59.371 ****** 28983 1726883089.37381: entering _queue_task() for managed_node2/setup 28983 1726883089.37713: worker is 1 (out of 1 available) 28983 1726883089.37726: exiting _queue_task() for managed_node2/setup 28983 1726883089.37933: done queuing things up, now waiting for results queue to drain 28983 1726883089.37938: waiting for pending results... 
28983 1726883089.38256: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883089.38394: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d82 28983 1726883089.38420: variable 'ansible_search_path' from source: unknown 28983 1726883089.38430: variable 'ansible_search_path' from source: unknown 28983 1726883089.38494: calling self._execute() 28983 1726883089.38624: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.38642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.38673: variable 'omit' from source: magic vars 28983 1726883089.39199: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.39232: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.39642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883089.43043: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883089.43160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883089.43258: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883089.43440: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883089.43496: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883089.43628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883089.43679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883089.43752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883089.43794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883089.43828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883089.43910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883089.43970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883089.44030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883089.44066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883089.44099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883089.44336: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883089.44360: variable 'ansible_facts' from source: unknown 28983 1726883089.46348: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883089.46352: when evaluation is False, skipping this task 28983 1726883089.46355: _execute() done 28983 1726883089.46357: dumping result to json 28983 1726883089.46359: done dumping result, returning 28983 1726883089.46362: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000001d82] 28983 1726883089.46364: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d82 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883089.46511: no more pending results, returning what we have 28983 1726883089.46516: results queue empty 28983 1726883089.46517: checking for any_errors_fatal 28983 1726883089.46520: done checking for any_errors_fatal 28983 1726883089.46521: checking for max_fail_percentage 28983 1726883089.46523: done checking for max_fail_percentage 28983 1726883089.46524: checking to see if all hosts have failed and the running result is not ok 28983 1726883089.46525: done checking to see if all hosts have failed 28983 1726883089.46526: getting the remaining hosts for this loop 28983 1726883089.46529: done getting the remaining hosts for this loop 28983 1726883089.46542: getting the next task for host managed_node2 28983 1726883089.46565: done getting next task for host managed_node2 28983 1726883089.46573: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883089.46582: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883089.46615: getting variables 28983 1726883089.46617: in VariableManager get_vars() 28983 1726883089.46986: Calling all_inventory to load vars for managed_node2 28983 1726883089.46990: Calling groups_inventory to load vars for managed_node2 28983 1726883089.46993: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.47004: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.47008: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.47013: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.47568: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d82 28983 1726883089.47580: WORKER PROCESS EXITING 28983 1726883089.50201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.53548: done with get_vars() 28983 1726883089.53607: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:44:49 -0400 (0:00:00.163) 0:01:59.535 ****** 28983 1726883089.53749: entering _queue_task() for managed_node2/stat 28983 1726883089.54192: worker is 1 (out of 1 available) 28983 1726883089.54207: exiting _queue_task() for managed_node2/stat 28983 1726883089.54452: done queuing things up, now waiting for results queue to drain 28983 1726883089.54455: waiting for pending results... 
28983 1726883089.54577: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883089.54811: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d84 28983 1726883089.54838: variable 'ansible_search_path' from source: unknown 28983 1726883089.54847: variable 'ansible_search_path' from source: unknown 28983 1726883089.54990: calling self._execute() 28983 1726883089.55041: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.55056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.55079: variable 'omit' from source: magic vars 28983 1726883089.55587: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.55605: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.55844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883089.56222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883089.56283: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883089.56343: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883089.56389: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883089.56506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883089.56556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883089.56631: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883089.56654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883089.56779: variable '__network_is_ostree' from source: set_fact 28983 1726883089.56844: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883089.56851: when evaluation is False, skipping this task 28983 1726883089.56854: _execute() done 28983 1726883089.56857: dumping result to json 28983 1726883089.56859: done dumping result, returning 28983 1726883089.56862: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000001d84] 28983 1726883089.56864: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d84 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883089.57136: no more pending results, returning what we have 28983 1726883089.57141: results queue empty 28983 1726883089.57142: checking for any_errors_fatal 28983 1726883089.57155: done checking for any_errors_fatal 28983 1726883089.57156: checking for max_fail_percentage 28983 1726883089.57158: done checking for max_fail_percentage 28983 1726883089.57159: checking to see if all hosts have failed and the running result is not ok 28983 1726883089.57160: done checking to see if all hosts have failed 28983 1726883089.57161: getting the remaining hosts for this loop 28983 1726883089.57174: done getting the remaining hosts for this loop 28983 1726883089.57180: getting the next task for host managed_node2 28983 1726883089.57191: done getting next task for host managed_node2 28983 
1726883089.57196: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883089.57203: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883089.57391: getting variables 28983 1726883089.57393: in VariableManager get_vars() 28983 1726883089.57444: Calling all_inventory to load vars for managed_node2 28983 1726883089.57448: Calling groups_inventory to load vars for managed_node2 28983 1726883089.57451: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.57458: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d84 28983 1726883089.57461: WORKER PROCESS EXITING 28983 1726883089.57473: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.57478: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.57482: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.60048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.63487: done with get_vars() 28983 1726883089.63526: done getting variables 28983 1726883089.63613: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:44:49 -0400 (0:00:00.099) 0:01:59.634 ****** 28983 1726883089.63667: entering _queue_task() for managed_node2/set_fact 28983 1726883089.64113: worker is 1 (out of 1 available) 28983 1726883089.64352: exiting _queue_task() for managed_node2/set_fact 28983 1726883089.64366: done queuing things up, now waiting for results queue to drain 28983 1726883089.64368: waiting for pending results... 
28983 1726883089.64504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883089.64744: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d85 28983 1726883089.64768: variable 'ansible_search_path' from source: unknown 28983 1726883089.64788: variable 'ansible_search_path' from source: unknown 28983 1726883089.64842: calling self._execute() 28983 1726883089.64967: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.64985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.65013: variable 'omit' from source: magic vars 28983 1726883089.65512: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.65530: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.65787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883089.66164: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883089.66233: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883089.66282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883089.66331: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883089.66449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883089.66486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883089.66532: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883089.66576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883089.66697: variable '__network_is_ostree' from source: set_fact 28983 1726883089.66746: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883089.66749: when evaluation is False, skipping this task 28983 1726883089.66753: _execute() done 28983 1726883089.66756: dumping result to json 28983 1726883089.66758: done dumping result, returning 28983 1726883089.66762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000001d85] 28983 1726883089.66767: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d85 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883089.67042: no more pending results, returning what we have 28983 1726883089.67047: results queue empty 28983 1726883089.67048: checking for any_errors_fatal 28983 1726883089.67055: done checking for any_errors_fatal 28983 1726883089.67057: checking for max_fail_percentage 28983 1726883089.67059: done checking for max_fail_percentage 28983 1726883089.67060: checking to see if all hosts have failed and the running result is not ok 28983 1726883089.67061: done checking to see if all hosts have failed 28983 1726883089.67236: getting the remaining hosts for this loop 28983 1726883089.67239: done getting the remaining hosts for this loop 28983 1726883089.67245: getting the next task for host managed_node2 28983 1726883089.67259: done getting next task for host managed_node2 28983 
1726883089.67264: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883089.67274: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883089.67300: getting variables 28983 1726883089.67302: in VariableManager get_vars() 28983 1726883089.67359: Calling all_inventory to load vars for managed_node2 28983 1726883089.67363: Calling groups_inventory to load vars for managed_node2 28983 1726883089.67366: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883089.67376: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d85 28983 1726883089.67379: WORKER PROCESS EXITING 28983 1726883089.67389: Calling all_plugins_play to load vars for managed_node2 28983 1726883089.67393: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883089.67398: Calling groups_plugins_play to load vars for managed_node2 28983 1726883089.69922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883089.73313: done with get_vars() 28983 1726883089.73368: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:44:49 -0400 (0:00:00.098) 0:01:59.732 ****** 28983 1726883089.73514: entering _queue_task() for managed_node2/service_facts 28983 1726883089.74013: worker is 1 (out of 1 available) 28983 1726883089.74029: exiting _queue_task() for managed_node2/service_facts 28983 1726883089.74045: done queuing things up, now waiting for results queue to drain 28983 1726883089.74048: waiting for pending results... 
28983 1726883089.74341: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883089.74579: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d87 28983 1726883089.74605: variable 'ansible_search_path' from source: unknown 28983 1726883089.74614: variable 'ansible_search_path' from source: unknown 28983 1726883089.74661: calling self._execute() 28983 1726883089.74796: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.74811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.74829: variable 'omit' from source: magic vars 28983 1726883089.75646: variable 'ansible_distribution_major_version' from source: facts 28983 1726883089.75678: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883089.75693: variable 'omit' from source: magic vars 28983 1726883089.76040: variable 'omit' from source: magic vars 28983 1726883089.76043: variable 'omit' from source: magic vars 28983 1726883089.76096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883089.76212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883089.76302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883089.76331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883089.76403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883089.76505: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883089.76514: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.76522: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883089.76777: Set connection var ansible_connection to ssh 28983 1726883089.76837: Set connection var ansible_shell_executable to /bin/sh 28983 1726883089.76882: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883089.76944: Set connection var ansible_timeout to 10 28983 1726883089.76959: Set connection var ansible_pipelining to False 28983 1726883089.76968: Set connection var ansible_shell_type to sh 28983 1726883089.77144: variable 'ansible_shell_executable' from source: unknown 28983 1726883089.77147: variable 'ansible_connection' from source: unknown 28983 1726883089.77151: variable 'ansible_module_compression' from source: unknown 28983 1726883089.77153: variable 'ansible_shell_type' from source: unknown 28983 1726883089.77155: variable 'ansible_shell_executable' from source: unknown 28983 1726883089.77157: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883089.77159: variable 'ansible_pipelining' from source: unknown 28983 1726883089.77161: variable 'ansible_timeout' from source: unknown 28983 1726883089.77163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883089.77504: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883089.77525: variable 'omit' from source: magic vars 28983 1726883089.77538: starting attempt loop 28983 1726883089.77546: running the handler 28983 1726883089.77564: _low_level_execute_command(): starting 28983 1726883089.77587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883089.78630: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.78686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883089.78706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883089.78735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883089.78899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883089.80653: stdout chunk (state=3): >>>/root <<< 28983 1726883089.80767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883089.80831: stderr chunk (state=3): >>><<< 28983 1726883089.80852: stdout chunk (state=3): >>><<< 28983 1726883089.80873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883089.80886: _low_level_execute_command(): starting 28983 1726883089.80897: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851 `" && echo ansible-tmp-1726883089.8087177-33304-54166979484851="` echo /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851 `" ) && sleep 0' 28983 1726883089.81561: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883089.81573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883089.81651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883089.83654: stdout chunk (state=3): >>>ansible-tmp-1726883089.8087177-33304-54166979484851=/root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851 <<< 28983 1726883089.83869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883089.83875: stdout chunk (state=3): >>><<< 28983 1726883089.83899: stderr chunk (state=3): >>><<< 28983 1726883089.83974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883089.8087177-33304-54166979484851=/root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883089.83989: variable 'ansible_module_compression' from source: unknown 28983 1726883089.84049: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883089.84105: variable 'ansible_facts' from source: unknown 28983 1726883089.84215: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py 28983 1726883089.84384: Sending initial data 28983 1726883089.84387: Sent initial data (161 bytes) 28983 1726883089.84939: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883089.84943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883089.84946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.84970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.85026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883089.85028: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726883089.85095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883089.86727: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883089.86733: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883089.86796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883089.86864: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxoactqqe /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py <<< 28983 1726883089.86873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py" <<< 28983 1726883089.86931: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxoactqqe" to remote "/root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py" <<< 28983 1726883089.88119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883089.88123: stderr chunk (state=3): >>><<< 28983 1726883089.88126: stdout chunk (state=3): >>><<< 28983 1726883089.88128: done transferring module to remote 28983 1726883089.88148: _low_level_execute_command(): starting 28983 1726883089.88151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/ /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py && sleep 0' 28983 1726883089.88670: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883089.88676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883089.88711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.88715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883089.88717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883089.88720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.88780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883089.88783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883089.88860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883089.90749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883089.90801: stderr chunk (state=3): >>><<< 28983 1726883089.90805: stdout chunk (state=3): >>><<< 28983 1726883089.90818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883089.90821: _low_level_execute_command(): starting 28983 1726883089.90827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/AnsiballZ_service_facts.py && sleep 0' 28983 1726883089.91289: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883089.91293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.91295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883089.91298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883089.91300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883089.91355: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883089.91362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883089.91440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883091.88449: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883091.90051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883091.90153: stderr chunk (state=3): >>><<< 28983 1726883091.90157: stdout chunk (state=3): >>><<< 28983 1726883091.90441: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": 
"nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": 
"bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": 
"plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883091.93863: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883091.93878: _low_level_execute_command(): starting 28983 1726883091.93884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883089.8087177-33304-54166979484851/ > /dev/null 2>&1 && sleep 0' 28983 1726883091.94942: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883091.94950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883091.94966: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 28983 1726883091.94981: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883091.94993: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883091.95038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883091.95106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883091.95127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883091.95359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883091.95514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883091.97550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883091.97553: stdout chunk (state=3): >>><<< 28983 1726883091.97555: stderr chunk (state=3): >>><<< 28983 1726883091.97570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883091.97582: handler run complete 28983 1726883091.98240: variable 'ansible_facts' from source: unknown 28983 1726883091.98474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883092.00139: variable 'ansible_facts' from source: unknown 28983 1726883092.00548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883092.01291: attempt loop complete, returning result 28983 1726883092.01305: _execute() done 28983 1726883092.01314: dumping result to json 28983 1726883092.01522: done dumping result, returning 28983 1726883092.01540: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000001d87] 28983 1726883092.01551: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d87 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883092.04983: no more pending results, returning what we have 28983 1726883092.04986: results queue empty 28983 1726883092.04987: checking for any_errors_fatal 28983 1726883092.04992: done checking for any_errors_fatal 28983 1726883092.04993: checking for max_fail_percentage 28983 1726883092.04994: done checking for max_fail_percentage 28983 1726883092.04995: checking to see if all hosts have failed and the 
running result is not ok 28983 1726883092.04996: done checking to see if all hosts have failed 28983 1726883092.04997: getting the remaining hosts for this loop 28983 1726883092.04999: done getting the remaining hosts for this loop 28983 1726883092.05003: getting the next task for host managed_node2 28983 1726883092.05009: done getting next task for host managed_node2 28983 1726883092.05013: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883092.05020: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883092.05039: getting variables 28983 1726883092.05041: in VariableManager get_vars() 28983 1726883092.05081: Calling all_inventory to load vars for managed_node2 28983 1726883092.05084: Calling groups_inventory to load vars for managed_node2 28983 1726883092.05087: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883092.05097: Calling all_plugins_play to load vars for managed_node2 28983 1726883092.05101: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883092.05106: Calling groups_plugins_play to load vars for managed_node2 28983 1726883092.05625: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d87 28983 1726883092.05635: WORKER PROCESS EXITING 28983 1726883092.10908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883092.16005: done with get_vars() 28983 1726883092.16170: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:44:52 -0400 (0:00:02.428) 0:02:02.161 ****** 28983 1726883092.16413: entering _queue_task() for managed_node2/package_facts 28983 1726883092.17119: worker is 1 (out of 1 available) 28983 1726883092.17136: exiting _queue_task() for managed_node2/package_facts 28983 1726883092.17152: done queuing things up, now waiting for results queue to drain 28983 1726883092.17154: waiting for pending results... 
28983 1726883092.17508: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883092.17942: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d88 28983 1726883092.17947: variable 'ansible_search_path' from source: unknown 28983 1726883092.17950: variable 'ansible_search_path' from source: unknown 28983 1726883092.17954: calling self._execute() 28983 1726883092.17957: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883092.17961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883092.17966: variable 'omit' from source: magic vars 28983 1726883092.18383: variable 'ansible_distribution_major_version' from source: facts 28983 1726883092.18400: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883092.18408: variable 'omit' from source: magic vars 28983 1726883092.18581: variable 'omit' from source: magic vars 28983 1726883092.18585: variable 'omit' from source: magic vars 28983 1726883092.18639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883092.18684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883092.18707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883092.18732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883092.18752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883092.18936: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883092.18944: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883092.18948: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883092.19441: Set connection var ansible_connection to ssh 28983 1726883092.19445: Set connection var ansible_shell_executable to /bin/sh 28983 1726883092.19448: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883092.19452: Set connection var ansible_timeout to 10 28983 1726883092.19457: Set connection var ansible_pipelining to False 28983 1726883092.19460: Set connection var ansible_shell_type to sh 28983 1726883092.19463: variable 'ansible_shell_executable' from source: unknown 28983 1726883092.19466: variable 'ansible_connection' from source: unknown 28983 1726883092.19469: variable 'ansible_module_compression' from source: unknown 28983 1726883092.19472: variable 'ansible_shell_type' from source: unknown 28983 1726883092.19475: variable 'ansible_shell_executable' from source: unknown 28983 1726883092.19477: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883092.19480: variable 'ansible_pipelining' from source: unknown 28983 1726883092.19483: variable 'ansible_timeout' from source: unknown 28983 1726883092.19486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883092.19931: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883092.20144: variable 'omit' from source: magic vars 28983 1726883092.20150: starting attempt loop 28983 1726883092.20153: running the handler 28983 1726883092.20156: _low_level_execute_command(): starting 28983 1726883092.20159: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883092.22755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883092.22827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883092.23041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883092.23260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883092.23373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883092.25178: stdout chunk (state=3): >>>/root <<< 28983 1726883092.25331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883092.25359: stderr chunk (state=3): >>><<< 28983 1726883092.25558: stdout chunk (state=3): >>><<< 28983 1726883092.25583: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883092.25770: _low_level_execute_command(): starting 28983 1726883092.25775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922 `" && echo ansible-tmp-1726883092.2564812-33433-208591085644922="` echo /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922 `" ) && sleep 0' 28983 1726883092.26894: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883092.26973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883092.26992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883092.27000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883092.27010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
28983 1726883092.27030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883092.27082: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883092.27178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883092.27253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883092.27356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883092.29746: stdout chunk (state=3): >>>ansible-tmp-1726883092.2564812-33433-208591085644922=/root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922 <<< 28983 1726883092.29750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883092.29753: stdout chunk (state=3): >>><<< 28983 1726883092.29756: stderr chunk (state=3): >>><<< 28983 1726883092.29759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883092.2564812-33433-208591085644922=/root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883092.29762: variable 'ansible_module_compression' from source: unknown 28983 1726883092.29829: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883092.30133: variable 'ansible_facts' from source: unknown 28983 1726883092.30755: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py 28983 1726883092.31463: Sending initial data 28983 1726883092.31466: Sent initial data (162 bytes) 28983 1726883092.32852: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883092.32988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883092.33002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883092.33091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883092.34850: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883092.34873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883092.34954: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp01gk2hu5 /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py <<< 28983 1726883092.35063: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py" <<< 28983 1726883092.35103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp01gk2hu5" to remote "/root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py" <<< 28983 1726883092.39722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883092.39996: stderr chunk (state=3): >>><<< 28983 1726883092.40000: stdout chunk (state=3): >>><<< 28983 1726883092.40003: done transferring module to remote 28983 1726883092.40005: _low_level_execute_command(): starting 28983 1726883092.40008: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/ /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py && sleep 0' 28983 1726883092.41204: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883092.41313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883092.41382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883092.41480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883092.41496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883092.41745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883092.43636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883092.43640: stdout chunk (state=3): >>><<< 28983 1726883092.43642: stderr chunk (state=3): >>><<< 28983 1726883092.43660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883092.43668: _low_level_execute_command(): starting 28983 1726883092.43679: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/AnsiballZ_package_facts.py && sleep 0' 28983 1726883092.44898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883092.44915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883092.45140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883092.45178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883092.45203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883092.45345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 28983 1726883093.08898: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 28983 1726883093.08911: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": 
"9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", 
"release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": 
"libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": 
"hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": 
"libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", 
"version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": 
"kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": 
"3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc3<<< 28983 1726883093.09093: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": 
"sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726883093.09149: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", 
"version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": 
"perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": 
"0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": 
"1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", 
"release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": 
"7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", 
"release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 28983 1726883093.09194: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": 
"6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": 
"1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": 
[{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "releas<<< 28983 1726883093.09478: stdout chunk (state=3): >>>e": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883093.11047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883093.11053: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883093.11115: stderr chunk (state=3): >>><<< 28983 1726883093.11119: stdout chunk (state=3): >>><<< 28983 1726883093.11159: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": 
"20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": 
"20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": 
"1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": 
[{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", 
"version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": 
"1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": 
[{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": 
"python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": 
"1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": 
"elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", 
"release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": 
"3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", 
"version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883093.15343: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883093.15346: _low_level_execute_command(): starting 28983 1726883093.15349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883092.2564812-33433-208591085644922/ > /dev/null 2>&1 && sleep 0' 28983 1726883093.16006: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883093.16088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883093.16114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883093.16142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883093.16245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883093.18288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883093.18303: stderr chunk (state=3): >>><<< 28983 1726883093.18312: stdout chunk (state=3): >>><<< 28983 1726883093.18331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883093.18348: handler run complete 28983 1726883093.19866: variable 'ansible_facts' from source: unknown 28983 1726883093.27482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883093.32849: variable 'ansible_facts' from source: unknown 28983 1726883093.34260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883093.36469: attempt loop complete, returning result 28983 1726883093.36476: _execute() done 28983 1726883093.36479: dumping result to json 28983 1726883093.36803: done dumping result, returning 28983 1726883093.36821: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000001d88] 28983 1726883093.36830: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d88 28983 1726883093.50030: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d88 28983 1726883093.50037: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883093.50166: no more pending results, returning what we have 28983 1726883093.50169: results queue empty 28983 1726883093.50170: checking for any_errors_fatal 28983 1726883093.50177: done checking for any_errors_fatal 28983 1726883093.50178: checking for max_fail_percentage 28983 1726883093.50179: done checking for max_fail_percentage 28983 1726883093.50180: checking to see if all hosts have failed and the running result is not ok 28983 1726883093.50182: done checking to see if all hosts have failed 28983 1726883093.50182: getting the remaining hosts for this loop 28983 1726883093.50184: done getting the remaining hosts for this loop 28983 
1726883093.50188: getting the next task for host managed_node2 28983 1726883093.50195: done getting next task for host managed_node2 28983 1726883093.50199: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883093.50206: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883093.50218: getting variables 28983 1726883093.50220: in VariableManager get_vars() 28983 1726883093.50247: Calling all_inventory to load vars for managed_node2 28983 1726883093.50251: Calling groups_inventory to load vars for managed_node2 28983 1726883093.50254: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883093.50261: Calling all_plugins_play to load vars for managed_node2 28983 1726883093.50265: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883093.50268: Calling groups_plugins_play to load vars for managed_node2 28983 1726883093.52574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883093.55737: done with get_vars() 28983 1726883093.55788: done getting variables 28983 1726883093.55854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:53 -0400 (0:00:01.394) 0:02:03.556 ****** 28983 1726883093.55899: entering _queue_task() for managed_node2/debug 28983 1726883093.56614: worker is 1 (out of 1 available) 28983 1726883093.56627: exiting _queue_task() for managed_node2/debug 28983 1726883093.56641: done queuing things up, now waiting for results queue to drain 28983 1726883093.56643: waiting for pending results... 
28983 1726883093.57079: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883093.57616: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d2c 28983 1726883093.57620: variable 'ansible_search_path' from source: unknown 28983 1726883093.57626: variable 'ansible_search_path' from source: unknown 28983 1726883093.57629: calling self._execute() 28983 1726883093.57723: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883093.57728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883093.57743: variable 'omit' from source: magic vars 28983 1726883093.58236: variable 'ansible_distribution_major_version' from source: facts 28983 1726883093.58249: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883093.58256: variable 'omit' from source: magic vars 28983 1726883093.58345: variable 'omit' from source: magic vars 28983 1726883093.58471: variable 'network_provider' from source: set_fact 28983 1726883093.58496: variable 'omit' from source: magic vars 28983 1726883093.58546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883093.58738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883093.58743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883093.58746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883093.58750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883093.58752: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883093.58755: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883093.58758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883093.58815: Set connection var ansible_connection to ssh 28983 1726883093.58828: Set connection var ansible_shell_executable to /bin/sh 28983 1726883093.58842: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883093.58853: Set connection var ansible_timeout to 10 28983 1726883093.58861: Set connection var ansible_pipelining to False 28983 1726883093.58864: Set connection var ansible_shell_type to sh 28983 1726883093.58896: variable 'ansible_shell_executable' from source: unknown 28983 1726883093.58900: variable 'ansible_connection' from source: unknown 28983 1726883093.58903: variable 'ansible_module_compression' from source: unknown 28983 1726883093.58906: variable 'ansible_shell_type' from source: unknown 28983 1726883093.58985: variable 'ansible_shell_executable' from source: unknown 28983 1726883093.58989: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883093.58992: variable 'ansible_pipelining' from source: unknown 28983 1726883093.58994: variable 'ansible_timeout' from source: unknown 28983 1726883093.58997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883093.59106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883093.59120: variable 'omit' from source: magic vars 28983 1726883093.59127: starting attempt loop 28983 1726883093.59154: running the handler 28983 1726883093.59210: handler run complete 28983 1726883093.59228: attempt loop complete, returning result 28983 1726883093.59232: _execute() done 28983 1726883093.59238: dumping result to json 28983 1726883093.59240: done dumping result, returning 
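The "Set connection var ..." lines above show the per-task connection settings the ssh plugin resolved for managed_node2: connection `ssh`, shell `/bin/sh`, module compression `ZIP_DEFLATED`, timeout 10, pipelining off. These typically originate from inventory host vars like the ones set during inventory parsing at the start of this run (`ansible_host`, `ansible_ssh_extra_args`). A hedged inventory sketch; the real file was /tmp/network-lQx/inventory.yml, whose contents are not shown, and the extra-args value below is an assumption:

```yaml
# Hypothetical YAML inventory fragment. ansible_host matches the
# address seen in the ssh debug output; the ssh extra args value is a
# placeholder, not recovered from the log.
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.46.139
      ansible_ssh_extra_args: "-o ControlMaster=auto -o ControlPersist=60s"  # placeholder
```

The "auto-mux: Trying existing master" lines in the ssh stderr are consistent with ControlMaster-style connection multiplexing: each task reuses an existing master socket rather than opening a fresh TCP connection.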
28983 1726883093.59313: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000001d2c] 28983 1726883093.59317: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2c 28983 1726883093.59386: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2c 28983 1726883093.59389: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883093.59501: no more pending results, returning what we have 28983 1726883093.59505: results queue empty 28983 1726883093.59506: checking for any_errors_fatal 28983 1726883093.59516: done checking for any_errors_fatal 28983 1726883093.59516: checking for max_fail_percentage 28983 1726883093.59518: done checking for max_fail_percentage 28983 1726883093.59520: checking to see if all hosts have failed and the running result is not ok 28983 1726883093.59521: done checking to see if all hosts have failed 28983 1726883093.59523: getting the remaining hosts for this loop 28983 1726883093.59525: done getting the remaining hosts for this loop 28983 1726883093.59531: getting the next task for host managed_node2 28983 1726883093.59540: done getting next task for host managed_node2 28983 1726883093.59545: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883093.59551: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883093.59564: getting variables 28983 1726883093.59566: in VariableManager get_vars() 28983 1726883093.59606: Calling all_inventory to load vars for managed_node2 28983 1726883093.59610: Calling groups_inventory to load vars for managed_node2 28983 1726883093.59612: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883093.59621: Calling all_plugins_play to load vars for managed_node2 28983 1726883093.59624: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883093.59628: Calling groups_plugins_play to load vars for managed_node2 28983 1726883093.64586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883093.68338: done with get_vars() 28983 1726883093.68385: done getting variables 28983 1726883093.68537: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:53 -0400 (0:00:00.126) 0:02:03.683 ****** 28983 1726883093.68592: entering _queue_task() for managed_node2/fail 28983 1726883093.69216: worker is 1 (out of 1 available) 28983 1726883093.69230: exiting _queue_task() for managed_node2/fail 28983 1726883093.69245: done queuing things up, now waiting for results queue to drain 28983 1726883093.69247: waiting for pending results... 28983 1726883093.70054: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883093.70241: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d2d 28983 1726883093.70440: variable 'ansible_search_path' from source: unknown 28983 1726883093.70445: variable 'ansible_search_path' from source: unknown 28983 1726883093.70463: calling self._execute() 28983 1726883093.70641: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883093.70645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883093.70686: variable 'omit' from source: magic vars 28983 1726883093.71633: variable 'ansible_distribution_major_version' from source: facts 28983 1726883093.71724: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883093.72014: variable 'network_state' from source: role '' defaults 28983 1726883093.72159: Evaluated conditional (network_state != {}): False 28983 1726883093.72169: when evaluation is False, skipping this task 28983 1726883093.72178: _execute() done 28983 1726883093.72187: dumping result to json 28983 1726883093.72197: done dumping result, returning 28983 1726883093.72210: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000001d2d] 28983 1726883093.72225: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2d 28983 1726883093.72564: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2d 28983 1726883093.72569: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883093.72632: no more pending results, returning what we have 28983 1726883093.72639: results queue empty 28983 1726883093.72640: checking for any_errors_fatal 28983 1726883093.72650: done checking for any_errors_fatal 28983 1726883093.72651: checking for max_fail_percentage 28983 1726883093.72653: done checking for max_fail_percentage 28983 1726883093.72654: checking to see if all hosts have failed and the running result is not ok 28983 1726883093.72655: done checking to see if all hosts have failed 28983 1726883093.72656: getting the remaining hosts for this loop 28983 1726883093.72659: done getting the remaining hosts for this loop 28983 1726883093.72663: getting the next task for host managed_node2 28983 1726883093.72675: done getting next task for host managed_node2 28983 1726883093.72679: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883093.72688: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883093.72721: getting variables 28983 1726883093.72723: in VariableManager get_vars() 28983 1726883093.72879: Calling all_inventory to load vars for managed_node2 28983 1726883093.72883: Calling groups_inventory to load vars for managed_node2 28983 1726883093.72886: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883093.72897: Calling all_plugins_play to load vars for managed_node2 28983 1726883093.72900: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883093.72905: Calling groups_plugins_play to load vars for managed_node2 28983 1726883093.77713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883093.83998: done with get_vars() 28983 1726883093.84163: done getting variables 28983 1726883093.84240: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:53 -0400 (0:00:00.157) 0:02:03.841 ****** 28983 1726883093.84378: entering _queue_task() for managed_node2/fail 28983 1726883093.85167: worker is 1 (out of 1 available) 28983 1726883093.85179: exiting _queue_task() for managed_node2/fail 28983 1726883093.85193: done queuing things up, now waiting for results queue to drain 28983 1726883093.85195: waiting for pending results... 28983 1726883093.85774: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883093.86161: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d2e 28983 1726883093.86184: variable 'ansible_search_path' from source: unknown 28983 1726883093.86189: variable 'ansible_search_path' from source: unknown 28983 1726883093.86281: calling self._execute() 28983 1726883093.86553: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883093.86560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883093.86574: variable 'omit' from source: magic vars 28983 1726883093.87648: variable 'ansible_distribution_major_version' from source: facts 28983 1726883093.87662: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883093.87825: variable 'network_state' from source: role '' defaults 28983 1726883093.87839: Evaluated conditional (network_state != {}): False 28983 1726883093.88046: when evaluation is False, skipping this task 28983 1726883093.88050: _execute() done 28983 1726883093.88054: dumping result to json 28983 1726883093.88057: done dumping result, returning 28983 1726883093.88067: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000001d2e] 28983 1726883093.88078: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2e 28983 1726883093.88444: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2e 28983 1726883093.88447: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883093.88502: no more pending results, returning what we have 28983 1726883093.88506: results queue empty 28983 1726883093.88507: checking for any_errors_fatal 28983 1726883093.88516: done checking for any_errors_fatal 28983 1726883093.88517: checking for max_fail_percentage 28983 1726883093.88519: done checking for max_fail_percentage 28983 1726883093.88520: checking to see if all hosts have failed and the running result is not ok 28983 1726883093.88521: done checking to see if all hosts have failed 28983 1726883093.88522: getting the remaining hosts for this loop 28983 1726883093.88523: done getting the remaining hosts for this loop 28983 1726883093.88528: getting the next task for host managed_node2 28983 1726883093.88537: done getting next task for host managed_node2 28983 1726883093.88542: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883093.88549: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883093.88578: getting variables 28983 1726883093.88579: in VariableManager get_vars() 28983 1726883093.88620: Calling all_inventory to load vars for managed_node2 28983 1726883093.88623: Calling groups_inventory to load vars for managed_node2 28983 1726883093.88625: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883093.88636: Calling all_plugins_play to load vars for managed_node2 28983 1726883093.88640: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883093.88644: Calling groups_plugins_play to load vars for managed_node2 28983 1726883093.93341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883093.99705: done with get_vars() 28983 1726883093.99872: done getting variables 28983 1726883094.00065: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:53 -0400 (0:00:00.157) 0:02:03.998 ****** 28983 1726883094.00115: entering _queue_task() for managed_node2/fail 28983 1726883094.01049: worker is 1 (out of 1 available) 28983 1726883094.01065: exiting _queue_task() for managed_node2/fail 28983 1726883094.01083: done queuing things up, now waiting for results queue to drain 28983 1726883094.01085: waiting for pending results... 28983 1726883094.01570: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883094.02040: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d2f 28983 1726883094.02044: variable 'ansible_search_path' from source: unknown 28983 1726883094.02047: variable 'ansible_search_path' from source: unknown 28983 1726883094.02050: calling self._execute() 28983 1726883094.02440: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883094.02446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883094.02449: variable 'omit' from source: magic vars 28983 1726883094.03943: variable 'ansible_distribution_major_version' from source: facts 28983 1726883094.03946: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883094.04524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883094.12751: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883094.12833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883094.12986: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883094.13028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883094.13061: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883094.13270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.13380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.13612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.13615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.13618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.14039: variable 'ansible_distribution_major_version' from source: facts 28983 1726883094.14043: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883094.14238: variable 'ansible_distribution' from source: facts 28983 1726883094.14278: variable '__network_rh_distros' from source: role '' defaults 28983 1726883094.14438: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883094.14442: when evaluation is False, skipping this task 28983 
1726883094.14444: _execute() done 28983 1726883094.14447: dumping result to json 28983 1726883094.14450: done dumping result, returning 28983 1726883094.14453: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000001d2f] 28983 1726883094.14455: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2f skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883094.14614: no more pending results, returning what we have 28983 1726883094.14618: results queue empty 28983 1726883094.14620: checking for any_errors_fatal 28983 1726883094.14627: done checking for any_errors_fatal 28983 1726883094.14628: checking for max_fail_percentage 28983 1726883094.14630: done checking for max_fail_percentage 28983 1726883094.14632: checking to see if all hosts have failed and the running result is not ok 28983 1726883094.14633: done checking to see if all hosts have failed 28983 1726883094.14635: getting the remaining hosts for this loop 28983 1726883094.14638: done getting the remaining hosts for this loop 28983 1726883094.14643: getting the next task for host managed_node2 28983 1726883094.14653: done getting next task for host managed_node2 28983 1726883094.14658: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883094.14665: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883094.14698: getting variables 28983 1726883094.14700: in VariableManager get_vars() 28983 1726883094.15057: Calling all_inventory to load vars for managed_node2 28983 1726883094.15061: Calling groups_inventory to load vars for managed_node2 28983 1726883094.15064: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883094.15077: Calling all_plugins_play to load vars for managed_node2 28983 1726883094.15081: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883094.15085: Calling groups_plugins_play to load vars for managed_node2 28983 1726883094.16041: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d2f 28983 1726883094.16045: WORKER PROCESS EXITING 28983 1726883094.21224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883094.28459: done with get_vars() 28983 1726883094.28508: done getting variables 28983 1726883094.28709: Loading ActionModule 'dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:54 -0400 (0:00:00.286) 0:02:04.285 ****** 28983 1726883094.28757: entering _queue_task() for managed_node2/dnf 28983 1726883094.29680: worker is 1 (out of 1 available) 28983 1726883094.29694: exiting _queue_task() for managed_node2/dnf 28983 1726883094.29708: done queuing things up, now waiting for results queue to drain 28983 1726883094.29710: waiting for pending results... 28983 1726883094.30164: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883094.30601: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d30 28983 1726883094.30620: variable 'ansible_search_path' from source: unknown 28983 1726883094.30625: variable 'ansible_search_path' from source: unknown 28983 1726883094.30665: calling self._execute() 28983 1726883094.30977: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883094.30982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883094.30996: variable 'omit' from source: magic vars 28983 1726883094.32040: variable 'ansible_distribution_major_version' from source: facts 28983 1726883094.32044: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883094.32690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883094.38202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883094.38418: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883094.38567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883094.38608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883094.38737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883094.38836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.38998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.39029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.39152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.39270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.39537: variable 'ansible_distribution' from source: facts 28983 1726883094.39541: variable 'ansible_distribution_major_version' from source: facts 28983 
1726883094.39550: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28983 1726883094.39808: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883094.40212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.40345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.40377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.40478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.40497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.40665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.40693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.40838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883094.40890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.40906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.41007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.41037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.41452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.41456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.41458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.41781: variable 'network_connections' from source: include params 28983 1726883094.41794: variable 'interface' from source: play vars 28983 1726883094.41986: variable 'interface' from source: play vars 28983 1726883094.42210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883094.42621: Loading TestModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883094.43039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883094.43042: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883094.43045: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883094.43124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883094.43216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883094.43252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.43287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883094.43542: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883094.44598: variable 'network_connections' from source: include params 28983 1726883094.44604: variable 'interface' from source: play vars 28983 1726883094.44851: variable 'interface' from source: play vars 28983 1726883094.44881: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883094.44885: when evaluation is False, skipping this task 28983 1726883094.44887: _execute() done 28983 1726883094.44893: dumping result to json 28983 1726883094.44898: done dumping result, returning 28983 
1726883094.44907: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001d30] 28983 1726883094.44914: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d30 28983 1726883094.45152: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d30 28983 1726883094.45155: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883094.45219: no more pending results, returning what we have 28983 1726883094.45223: results queue empty 28983 1726883094.45224: checking for any_errors_fatal 28983 1726883094.45231: done checking for any_errors_fatal 28983 1726883094.45232: checking for max_fail_percentage 28983 1726883094.45237: done checking for max_fail_percentage 28983 1726883094.45238: checking to see if all hosts have failed and the running result is not ok 28983 1726883094.45240: done checking to see if all hosts have failed 28983 1726883094.45240: getting the remaining hosts for this loop 28983 1726883094.45243: done getting the remaining hosts for this loop 28983 1726883094.45249: getting the next task for host managed_node2 28983 1726883094.45259: done getting next task for host managed_node2 28983 1726883094.45264: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883094.45271: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883094.45305: getting variables 28983 1726883094.45307: in VariableManager get_vars() 28983 1726883094.45866: Calling all_inventory to load vars for managed_node2 28983 1726883094.45870: Calling groups_inventory to load vars for managed_node2 28983 1726883094.45873: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883094.45884: Calling all_plugins_play to load vars for managed_node2 28983 1726883094.45888: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883094.45892: Calling groups_plugins_play to load vars for managed_node2 28983 1726883094.52232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883094.56618: done with get_vars() 28983 1726883094.56660: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883094.56752: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  21:44:54 -0400 (0:00:00.280)       0:02:04.565 ******
28983 1726883094.56795: entering _queue_task() for managed_node2/yum 28983 1726883094.57640: worker is 1 (out of 1 available) 28983 1726883094.57654: exiting _queue_task() for managed_node2/yum 28983 1726883094.57668: done queuing things up, now waiting for results queue to drain 28983 1726883094.57670: waiting for pending results... 28983 1726883094.58493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883094.58499: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d31 28983 1726883094.58513: variable 'ansible_search_path' from source: unknown 28983 1726883094.58517: variable 'ansible_search_path' from source: unknown 28983 1726883094.58718: calling self._execute() 28983 1726883094.58948: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883094.58955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883094.58968: variable 'omit' from source: magic vars 28983 1726883094.59507: variable 'ansible_distribution_major_version' from source: facts 28983 1726883094.59519: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883094.59787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883094.64269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883094.64318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883094.64366: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883094.64410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883094.64443: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883094.64705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.64710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.64714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.64717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.64732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.64946: variable 'ansible_distribution_major_version' from source: facts 28983 1726883094.64963: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883094.64966: when evaluation is False, skipping this task 28983 1726883094.64970: _execute() done 28983 1726883094.64977: dumping result to json 28983 1726883094.64982: done dumping result, returning 28983 1726883094.64991: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001d31] 28983 1726883094.64996: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d31 28983 1726883094.65208: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d31
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
28983 1726883094.65273: no more pending results, returning what we have 28983 1726883094.65277: results queue empty 28983 1726883094.65278: checking for any_errors_fatal 28983 1726883094.65288: done checking for any_errors_fatal 28983 1726883094.65289: checking for max_fail_percentage 28983 1726883094.65291: done checking for max_fail_percentage 28983 1726883094.65292: checking to see if all hosts have failed and the running result is not ok 28983 1726883094.65293: done checking to see if all hosts have failed 28983 1726883094.65294: getting the remaining hosts for this loop 28983 1726883094.65296: done getting the remaining hosts for this loop 28983 1726883094.65302: getting the next task for host managed_node2 28983 1726883094.65311: done getting next task for host managed_node2 28983 1726883094.65317: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883094.65324: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883094.65338: WORKER PROCESS EXITING 28983 1726883094.65455: getting variables 28983 1726883094.65457: in VariableManager get_vars() 28983 1726883094.65507: Calling all_inventory to load vars for managed_node2 28983 1726883094.65510: Calling groups_inventory to load vars for managed_node2 28983 1726883094.65513: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883094.65523: Calling all_plugins_play to load vars for managed_node2 28983 1726883094.65527: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883094.65531: Calling groups_plugins_play to load vars for managed_node2 28983 1726883094.69254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883094.73944: done with get_vars() 28983 1726883094.73996: done getting variables 28983 1726883094.74105: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  21:44:54 -0400 (0:00:00.173)       0:02:04.739 ******
28983 1726883094.74224: entering _queue_task() for managed_node2/fail 28983 1726883094.74744: worker is 1 (out of 1 available) 28983 1726883094.74756: exiting _queue_task() for managed_node2/fail 28983 1726883094.74774: done queuing things up, now waiting for results queue to drain 28983 1726883094.74776: waiting for pending results... 28983 1726883094.75112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883094.75187: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d32 28983 1726883094.75207: variable 'ansible_search_path' from source: unknown 28983 1726883094.75210: variable 'ansible_search_path' from source: unknown 28983 1726883094.75252: calling self._execute() 28983 1726883094.75382: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883094.75388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883094.75401: variable 'omit' from source: magic vars 28983 1726883094.75895: variable 'ansible_distribution_major_version' from source: facts 28983 1726883094.75908: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883094.76067: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883094.76437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883094.82792: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883094.82870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883094.82914: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883094.83158: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883094.83193: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883094.83288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.83327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.83460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.83544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.83553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.83697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 
1726883094.83726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.83982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.84014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.84091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883094.84099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883094.84112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883094.84347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.84400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883094.84418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 28983 1726883094.84841: variable 'network_connections' from source: include params 28983 1726883094.84860: variable 'interface' from source: play vars 28983 1726883094.85142: variable 'interface' from source: play vars 28983 1726883094.85233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883094.85652: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883094.85737: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883094.85742: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883094.85977: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883094.86028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883094.86077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883094.86139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883094.86143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883094.86387: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883094.86920: variable 'network_connections' from source: include params 28983 1726883094.86933: variable 'interface' from source: play 
vars 28983 1726883094.87209: variable 'interface' from source: play vars 28983 1726883094.87238: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883094.87242: when evaluation is False, skipping this task 28983 1726883094.87245: _execute() done 28983 1726883094.87262: dumping result to json 28983 1726883094.87340: done dumping result, returning 28983 1726883094.87344: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001d32] 28983 1726883094.87346: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d32
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
28983 1726883094.87550: no more pending results, returning what we have 28983 1726883094.87554: results queue empty 28983 1726883094.87555: checking for any_errors_fatal 28983 1726883094.87564: done checking for any_errors_fatal 28983 1726883094.87565: checking for max_fail_percentage 28983 1726883094.87567: done checking for max_fail_percentage 28983 1726883094.87568: checking to see if all hosts have failed and the running result is not ok 28983 1726883094.87569: done checking to see if all hosts have failed 28983 1726883094.87570: getting the remaining hosts for this loop 28983 1726883094.87572: done getting the remaining hosts for this loop 28983 1726883094.87577: getting the next task for host managed_node2 28983 1726883094.87588: done getting next task for host managed_node2 28983 1726883094.87594: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883094.87600: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883094.87628: getting variables 28983 1726883094.87630: in VariableManager get_vars() 28983 1726883094.87679: Calling all_inventory to load vars for managed_node2 28983 1726883094.87683: Calling groups_inventory to load vars for managed_node2 28983 1726883094.87686: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883094.87695: Calling all_plugins_play to load vars for managed_node2 28983 1726883094.87699: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883094.87703: Calling groups_plugins_play to load vars for managed_node2 28983 1726883094.88271: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d32 28983 1726883094.88275: WORKER PROCESS EXITING 28983 1726883094.93087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883094.99126: done with get_vars() 28983 1726883094.99183: done getting variables 28983 1726883094.99275: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  21:44:54 -0400 (0:00:00.251)       0:02:04.991 ******
28983 1726883094.99324: entering _queue_task() for managed_node2/package 28983 1726883094.99936: worker is 1 (out of 1 available) 28983 1726883094.99946: exiting _queue_task() for managed_node2/package 28983 1726883094.99959: done queuing things up, now waiting for results queue to drain 28983 1726883094.99960: waiting for pending results... 28983 1726883095.00205: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883095.00421: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d33 28983 1726883095.00450: variable 'ansible_search_path' from source: unknown 28983 1726883095.00530: variable 'ansible_search_path' from source: unknown 28983 1726883095.00535: calling self._execute() 28983 1726883095.00667: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883095.00687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883095.00704: variable 'omit' from source: magic vars 28983 1726883095.01224: variable 'ansible_distribution_major_version' from source: facts 28983 1726883095.01244: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883095.01622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883095.01939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 
1726883095.02012: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883095.02069: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883095.02173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883095.02333: variable 'network_packages' from source: role '' defaults 28983 1726883095.02498: variable '__network_provider_setup' from source: role '' defaults 28983 1726883095.02527: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883095.02729: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883095.02734: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883095.02737: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883095.03024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883095.05739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883095.05828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883095.05876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883095.05928: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883095.06016: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883095.06081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.06122: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.06172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.06225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.06264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.06321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.06372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.06408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.06477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.06503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883095.06847: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883095.06998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.07043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.07078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.07145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.07167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.07326: variable 'ansible_python' from source: facts 28983 1726883095.07331: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883095.07420: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883095.07531: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883095.07727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.07841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.07845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.07896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.07919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.07996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.08042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.08078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.08143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.08166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.08416: variable 'network_connections' from source: include params 
28983 1726883095.08419: variable 'interface' from source: play vars 28983 1726883095.08539: variable 'interface' from source: play vars 28983 1726883095.08877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883095.08925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883095.08978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.09040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883095.09114: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883095.09517: variable 'network_connections' from source: include params 28983 1726883095.09528: variable 'interface' from source: play vars 28983 1726883095.09672: variable 'interface' from source: play vars 28983 1726883095.09939: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883095.09942: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883095.10287: variable 'network_connections' from source: include params 28983 1726883095.10297: variable 'interface' from source: play vars 28983 1726883095.10385: variable 'interface' from source: play vars 28983 1726883095.10421: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883095.10544: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883095.11002: variable 'network_connections' 
from source: include params 28983 1726883095.11013: variable 'interface' from source: play vars 28983 1726883095.11107: variable 'interface' from source: play vars 28983 1726883095.11196: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883095.11290: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883095.11304: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883095.11441: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883095.11740: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883095.12490: variable 'network_connections' from source: include params 28983 1726883095.12502: variable 'interface' from source: play vars 28983 1726883095.12595: variable 'interface' from source: play vars 28983 1726883095.12610: variable 'ansible_distribution' from source: facts 28983 1726883095.12620: variable '__network_rh_distros' from source: role '' defaults 28983 1726883095.12633: variable 'ansible_distribution_major_version' from source: facts 28983 1726883095.12682: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883095.12922: variable 'ansible_distribution' from source: facts 28983 1726883095.12932: variable '__network_rh_distros' from source: role '' defaults 28983 1726883095.12945: variable 'ansible_distribution_major_version' from source: facts 28983 1726883095.12957: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883095.13204: variable 'ansible_distribution' from source: facts 28983 1726883095.13213: variable '__network_rh_distros' from source: role '' defaults 28983 1726883095.13238: variable 'ansible_distribution_major_version' from source: facts 28983 1726883095.13335: variable 'network_provider' from source: set_fact 28983 
1726883095.13344: variable 'ansible_facts' from source: unknown
28983 1726883095.14820: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
28983 1726883095.14830: when evaluation is False, skipping this task
28983 1726883095.14841: _execute() done
28983 1726883095.14850: dumping result to json
28983 1726883095.14940: done dumping result, returning
28983 1726883095.14955: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000001d33]
28983 1726883095.14966: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d33
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
28983 1726883095.15158: no more pending results, returning what we have
28983 1726883095.15163: results queue empty
28983 1726883095.15164: checking for any_errors_fatal
28983 1726883095.15172: done checking for any_errors_fatal
28983 1726883095.15173: checking for max_fail_percentage
28983 1726883095.15175: done checking for max_fail_percentage
28983 1726883095.15176: checking to see if all hosts have failed and the running result is not ok
28983 1726883095.15177: done checking to see if all hosts have failed
28983 1726883095.15178: getting the remaining hosts for this loop
28983 1726883095.15180: done getting the remaining hosts for this loop
28983 1726883095.15186: getting the next task for host managed_node2
28983 1726883095.15198: done getting next task for host managed_node2
28983 1726883095.15205: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
28983 1726883095.15212: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883095.15260: getting variables 28983 1726883095.15262: in VariableManager get_vars() 28983 1726883095.15316: Calling all_inventory to load vars for managed_node2 28983 1726883095.15320: Calling groups_inventory to load vars for managed_node2 28983 1726883095.15546: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883095.15664: Calling all_plugins_play to load vars for managed_node2 28983 1726883095.15669: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883095.15673: Calling groups_plugins_play to load vars for managed_node2 28983 1726883095.16472: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d33 28983 1726883095.16476: WORKER PROCESS EXITING 28983 1726883095.21377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883095.28075: done with get_vars() 28983 1726883095.28124: done getting variables 28983 1726883095.28360: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:44:55 -0400 (0:00:00.291) 0:02:05.282 ******
28983 1726883095.28469: entering _queue_task() for managed_node2/package
28983 1726883095.29358: worker is 1 (out of 1 available)
28983 1726883095.29376: exiting _queue_task() for managed_node2/package
28983 1726883095.29390: done queuing things up, now waiting for results queue to drain
28983 1726883095.29392: waiting for pending results...
28983 1726883095.29709: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
28983 1726883095.30175: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d34
28983 1726883095.30179: variable 'ansible_search_path' from source: unknown
28983 1726883095.30181: variable 'ansible_search_path' from source: unknown
28983 1726883095.30594: calling self._execute()
28983 1726883095.30714: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883095.30722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883095.30737: variable 'omit' from source: magic vars
28983 1726883095.31790: variable 'ansible_distribution_major_version' from source: facts
28983 1726883095.31796: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883095.32011: variable 'network_state' from source: role '' defaults
28983 1726883095.32024: Evaluated conditional (network_state != {}): False
28983 1726883095.32028: when evaluation
is False, skipping this task
28983 1726883095.32031: _execute() done
28983 1726883095.32227: dumping result to json
28983 1726883095.32230: done dumping result, returning
28983 1726883095.32234: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001d34]
28983 1726883095.32237: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d34
28983 1726883095.32315: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d34
28983 1726883095.32318: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
28983 1726883095.32368: no more pending results, returning what we have
28983 1726883095.32372: results queue empty
28983 1726883095.32373: checking for any_errors_fatal
28983 1726883095.32378: done checking for any_errors_fatal
28983 1726883095.32379: checking for max_fail_percentage
28983 1726883095.32381: done checking for max_fail_percentage
28983 1726883095.32382: checking to see if all hosts have failed and the running result is not ok
28983 1726883095.32383: done checking to see if all hosts have failed
28983 1726883095.32384: getting the remaining hosts for this loop
28983 1726883095.32385: done getting the remaining hosts for this loop
28983 1726883095.32389: getting the next task for host managed_node2
28983 1726883095.32397: done getting next task for host managed_node2
28983 1726883095.32401: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
28983 1726883095.32407: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883095.32430: getting variables 28983 1726883095.32431: in VariableManager get_vars() 28983 1726883095.32590: Calling all_inventory to load vars for managed_node2 28983 1726883095.32594: Calling groups_inventory to load vars for managed_node2 28983 1726883095.32597: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883095.32607: Calling all_plugins_play to load vars for managed_node2 28983 1726883095.32611: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883095.32615: Calling groups_plugins_play to load vars for managed_node2 28983 1726883095.38309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883095.43953: done with get_vars() 28983 1726883095.43996: done getting variables 28983 1726883095.44190: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:44:55 -0400 (0:00:00.157) 0:02:05.440 ******
28983 1726883095.44237: entering _queue_task() for managed_node2/package
28983 1726883095.45244: worker is 1 (out of 1 available)
28983 1726883095.45258: exiting _queue_task() for managed_node2/package
28983 1726883095.45270: done queuing things up, now waiting for results queue to drain
28983 1726883095.45272: waiting for pending results...
28983 1726883095.46360: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
28983 1726883095.46366: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d35
28983 1726883095.46369: variable 'ansible_search_path' from source: unknown
28983 1726883095.46372: variable 'ansible_search_path' from source: unknown
28983 1726883095.46375: calling self._execute()
28983 1726883095.46431: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883095.46438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883095.46452: variable 'omit' from source: magic vars
28983 1726883095.47306: variable 'ansible_distribution_major_version' from source: facts
28983 1726883095.47320: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883095.47800: variable 'network_state' from source: role '' defaults
28983 1726883095.47806: Evaluated conditional (network_state != {}): False
28983 1726883095.47808: when evaluation is False, skipping this task
28983 1726883095.47811: _execute() done
28983 1726883095.47814: dumping
result to json
28983 1726883095.47817: done dumping result, returning
28983 1726883095.47820: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000001d35]
28983 1726883095.47822: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d35
28983 1726883095.48429: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d35
28983 1726883095.48432: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
28983 1726883095.48478: no more pending results, returning what we have
28983 1726883095.48483: results queue empty
28983 1726883095.48484: checking for any_errors_fatal
28983 1726883095.48491: done checking for any_errors_fatal
28983 1726883095.48492: checking for max_fail_percentage
28983 1726883095.48495: done checking for max_fail_percentage
28983 1726883095.48496: checking to see if all hosts have failed and the running result is not ok
28983 1726883095.48497: done checking to see if all hosts have failed
28983 1726883095.48498: getting the remaining hosts for this loop
28983 1726883095.48500: done getting the remaining hosts for this loop
28983 1726883095.48504: getting the next task for host managed_node2
28983 1726883095.48513: done getting next task for host managed_node2
28983 1726883095.48523: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28983 1726883095.48529: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
28983 1726883095.48556: getting variables
28983 1726883095.48557: in VariableManager get_vars()
28983 1726883095.48601: Calling all_inventory to load vars for managed_node2
28983 1726883095.48604: Calling groups_inventory to load vars for managed_node2
28983 1726883095.48607: Calling all_plugins_inventory to load vars for managed_node2
28983 1726883095.48617: Calling all_plugins_play to load vars for managed_node2
28983 1726883095.48621: Calling groups_plugins_inventory to load vars for managed_node2
28983 1726883095.48937: Calling groups_plugins_play to load vars for managed_node2
28983 1726883095.54469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
28983 1726883095.59701: done with get_vars()
28983 1726883095.59759: done getting variables
28983 1726883095.59848: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:44:55 -0400 (0:00:00.156) 0:02:05.596 ******
28983 1726883095.59903: entering _queue_task() for managed_node2/service
28983 1726883095.60476: worker is 1 (out of 1 available)
28983 1726883095.60488: exiting _queue_task() for managed_node2/service
28983 1726883095.60501: done queuing things up, now waiting for results queue to drain
28983 1726883095.60503: waiting for pending results...
28983 1726883095.61079: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
28983 1726883095.61520: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d36
28983 1726883095.61722: variable 'ansible_search_path' from source: unknown
28983 1726883095.61726: variable 'ansible_search_path' from source: unknown
28983 1726883095.61729: calling self._execute()
28983 1726883095.61926: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883095.61976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883095.62004: variable 'omit' from source: magic vars
28983 1726883095.62509: variable 'ansible_distribution_major_version' from source: facts
28983 1726883095.62530: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883095.62717: variable '__network_wireless_connections_defined' from source: role '' defaults
28983 1726883095.63018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
28983 1726883095.65926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
28983 1726883095.66026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
28983 1726883095.66241: Loading FilterModule 'mathstuff' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883095.66245: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883095.66265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883095.66372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.66425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.66474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.66531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.66558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.66629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.66666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.66712: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.66771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.66805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.66864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.67010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.67014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.67017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.67019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.67263: variable 'network_connections' from source: include params 28983 1726883095.67284: variable 'interface' from source: play vars 28983 1726883095.67374: variable 'interface' from source: play vars 28983 1726883095.67475: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883095.67696: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883095.67790: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883095.68143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883095.68147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883095.68149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883095.68254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883095.68257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.68260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883095.68303: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883095.69023: variable 'network_connections' from source: include params 28983 1726883095.69040: variable 'interface' from source: play vars 28983 1726883095.69188: variable 'interface' from source: play vars 28983 1726883095.69270: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883095.69293: when evaluation is False, skipping this task 28983 
1726883095.69349: _execute() done
28983 1726883095.69367: dumping result to json
28983 1726883095.69379: done dumping result, returning
28983 1726883095.69399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000001d36]
28983 1726883095.69453: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d36
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
28983 1726883095.69813: no more pending results, returning what we have
28983 1726883095.69819: results queue empty
28983 1726883095.69820: checking for any_errors_fatal
28983 1726883095.69829: done checking for any_errors_fatal
28983 1726883095.69831: checking for max_fail_percentage
28983 1726883095.69835: done checking for max_fail_percentage
28983 1726883095.69836: checking to see if all hosts have failed and the running result is not ok
28983 1726883095.69837: done checking to see if all hosts have failed
28983 1726883095.69838: getting the remaining hosts for this loop
28983 1726883095.69841: done getting the remaining hosts for this loop
28983 1726883095.69847: getting the next task for host managed_node2
28983 1726883095.69858: done getting next task for host managed_node2
28983 1726883095.69865: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28983 1726883095.69874: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883095.69909: getting variables 28983 1726883095.69912: in VariableManager get_vars() 28983 1726883095.70217: Calling all_inventory to load vars for managed_node2 28983 1726883095.70221: Calling groups_inventory to load vars for managed_node2 28983 1726883095.70224: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883095.70347: Calling all_plugins_play to load vars for managed_node2 28983 1726883095.70352: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883095.70356: Calling groups_plugins_play to load vars for managed_node2 28983 1726883095.71162: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d36 28983 1726883095.71167: WORKER PROCESS EXITING 28983 1726883095.75917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883095.82401: done with get_vars() 28983 1726883095.82458: done getting variables 28983 1726883095.82663: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:44:55 -0400 (0:00:00.228) 0:02:05.824 ******
28983 1726883095.82710: entering _queue_task() for managed_node2/service
28983 1726883095.83777: worker is 1 (out of 1 available)
28983 1726883095.83792: exiting _queue_task() for managed_node2/service
28983 1726883095.83855: done queuing things up, now waiting for results queue to drain
28983 1726883095.83863: waiting for pending results...
28983 1726883095.84623: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
28983 1726883095.85158: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d37
28983 1726883095.85293: variable 'ansible_search_path' from source: unknown
28983 1726883095.85299: variable 'ansible_search_path' from source: unknown
28983 1726883095.85372: calling self._execute()
28983 1726883095.85912: variable 'ansible_host' from source: host vars for 'managed_node2'
28983 1726883095.85918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
28983 1726883095.86003: variable 'omit' from source: magic vars
28983 1726883095.87480: variable 'ansible_distribution_major_version' from source: facts
28983 1726883095.87495: Evaluated conditional (ansible_distribution_major_version != '6'): True
28983 1726883095.88057: variable 'network_provider' from source: set_fact
28983 1726883095.88066: variable 'network_state' from source: role '' defaults
28983 1726883095.88084: Evaluated conditional (network_provider == "nm" or network_state != {}): True
28983 1726883095.88088: variable 'omit' from source: magic vars
28983 1726883095.88316: variable
'omit' from source: magic vars 28983 1726883095.88348: variable 'network_service_name' from source: role '' defaults 28983 1726883095.88567: variable 'network_service_name' from source: role '' defaults 28983 1726883095.89050: variable '__network_provider_setup' from source: role '' defaults 28983 1726883095.89057: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883095.89138: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883095.89218: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883095.89512: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883095.90557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883095.96366: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883095.96453: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883095.96508: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883095.96578: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883095.96614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883095.96722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.96768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.96813: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.96872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.96903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.96968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.97009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.97047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.97102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.97132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.97472: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883095.97639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.97682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.97717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.97839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.97843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.97925: variable 'ansible_python' from source: facts 28983 1726883095.97952: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883095.98076: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883095.98216: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883095.98440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.98577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.98609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.98786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.98844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.99020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883095.99141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883095.99145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883095.99206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883095.99224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883095.99703: variable 'network_connections' from source: include params 28983 1726883095.99711: variable 'interface' from source: play vars 28983 1726883095.99807: variable 'interface' from source: play vars 28983 1726883096.00167: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883096.00465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883096.00530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883096.00591: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883096.00686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883096.00754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883096.00793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883096.00838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883096.00881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883096.00946: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883096.01354: variable 'network_connections' from source: include params 28983 1726883096.01368: variable 'interface' from source: play vars 28983 1726883096.01540: variable 'interface' from source: play vars 28983 1726883096.01544: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883096.01608: variable '__network_wireless_connections_defined' from source: role '' defaults 
28983 1726883096.02032: variable 'network_connections' from source: include params 28983 1726883096.02039: variable 'interface' from source: play vars 28983 1726883096.02131: variable 'interface' from source: play vars 28983 1726883096.02160: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883096.02269: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883096.02693: variable 'network_connections' from source: include params 28983 1726883096.02741: variable 'interface' from source: play vars 28983 1726883096.02794: variable 'interface' from source: play vars 28983 1726883096.02865: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883096.02944: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883096.02952: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883096.03033: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883096.03396: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883096.04087: variable 'network_connections' from source: include params 28983 1726883096.04093: variable 'interface' from source: play vars 28983 1726883096.04170: variable 'interface' from source: play vars 28983 1726883096.04212: variable 'ansible_distribution' from source: facts 28983 1726883096.04216: variable '__network_rh_distros' from source: role '' defaults 28983 1726883096.04218: variable 'ansible_distribution_major_version' from source: facts 28983 1726883096.04221: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883096.04455: variable 'ansible_distribution' from source: facts 28983 1726883096.04458: variable '__network_rh_distros' from source: role '' defaults 28983 1726883096.04487: variable 'ansible_distribution_major_version' from 
source: facts 28983 1726883096.04491: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883096.04709: variable 'ansible_distribution' from source: facts 28983 1726883096.04713: variable '__network_rh_distros' from source: role '' defaults 28983 1726883096.04751: variable 'ansible_distribution_major_version' from source: facts 28983 1726883096.04764: variable 'network_provider' from source: set_fact 28983 1726883096.04793: variable 'omit' from source: magic vars 28983 1726883096.04832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883096.04873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883096.05039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883096.05049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883096.05052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883096.05054: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883096.05057: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883096.05059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883096.05102: Set connection var ansible_connection to ssh 28983 1726883096.05116: Set connection var ansible_shell_executable to /bin/sh 28983 1726883096.05127: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883096.05143: Set connection var ansible_timeout to 10 28983 1726883096.05151: Set connection var ansible_pipelining to False 28983 1726883096.05154: Set connection var ansible_shell_type to sh 28983 1726883096.05184: variable 'ansible_shell_executable' from 
source: unknown 28983 1726883096.05187: variable 'ansible_connection' from source: unknown 28983 1726883096.05190: variable 'ansible_module_compression' from source: unknown 28983 1726883096.05195: variable 'ansible_shell_type' from source: unknown 28983 1726883096.05198: variable 'ansible_shell_executable' from source: unknown 28983 1726883096.05291: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883096.05294: variable 'ansible_pipelining' from source: unknown 28983 1726883096.05297: variable 'ansible_timeout' from source: unknown 28983 1726883096.05300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883096.05351: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883096.05368: variable 'omit' from source: magic vars 28983 1726883096.05376: starting attempt loop 28983 1726883096.05379: running the handler 28983 1726883096.05584: variable 'ansible_facts' from source: unknown 28983 1726883096.07553: _low_level_execute_command(): starting 28983 1726883096.07565: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883096.08387: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883096.08392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883096.08456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883096.08468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883096.08484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883096.08501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883096.08614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883096.10454: stdout chunk (state=3): >>>/root <<< 28983 1726883096.10591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883096.10602: stdout chunk (state=3): >>><<< 28983 1726883096.10618: stderr chunk (state=3): >>><<< 28983 1726883096.10647: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883096.10666: _low_level_execute_command(): starting 28983 1726883096.10680: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647 `" && echo ansible-tmp-1726883096.1065469-33598-61391355735647="` echo /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647 `" ) && sleep 0' 28983 1726883096.11426: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883096.11442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883096.11510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883096.11544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883096.11590: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 28983 1726883096.11601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883096.11730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883096.11877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883096.13884: stdout chunk (state=3): >>>ansible-tmp-1726883096.1065469-33598-61391355735647=/root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647 <<< 28983 1726883096.14011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883096.14046: stderr chunk (state=3): >>><<< 28983 1726883096.14050: stdout chunk (state=3): >>><<< 28983 1726883096.14066: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883096.1065469-33598-61391355735647=/root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883096.14095: variable 'ansible_module_compression' from source: unknown 28983 1726883096.14141: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883096.14189: variable 'ansible_facts' from source: unknown 28983 1726883096.14323: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py 28983 1726883096.14562: Sending initial data 28983 1726883096.14565: Sent initial data (155 bytes) 28983 1726883096.15251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883096.15276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883096.15308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883096.15326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883096.15509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883096.15648: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883096.17270: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883096.17352: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883096.17414: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp0_9w0199 /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py <<< 28983 1726883096.17418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py" <<< 28983 1726883096.17515: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp0_9w0199" to remote "/root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py" <<< 28983 1726883096.20301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883096.20346: stderr chunk (state=3): >>><<< 28983 1726883096.20360: stdout chunk 
(state=3): >>><<< 28983 1726883096.20399: done transferring module to remote 28983 1726883096.20429: _low_level_execute_command(): starting 28983 1726883096.20500: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/ /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py && sleep 0' 28983 1726883096.21168: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883096.21192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883096.21293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883096.21344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883096.21367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883096.21476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883096.23508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883096.23511: stdout chunk (state=3): >>><<< 28983 
1726883096.23514: stderr chunk (state=3): >>><<< 28983 1726883096.23627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883096.23630: _low_level_execute_command(): starting 28983 1726883096.23633: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/AnsiballZ_systemd.py && sleep 0' 28983 1726883096.24245: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883096.24369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883096.24423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883096.24521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883096.57630: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", 
"ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4534272", "MemoryAvailable": "infinity", "CPUUsageNSec": "1674878000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": 
"journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", 
"CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883096.59862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883096.59888: stderr chunk (state=3): >>><<< 28983 1726883096.59891: stdout chunk (state=3): >>><<< 28983 1726883096.59907: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4534272", "MemoryAvailable": "infinity", "CPUUsageNSec": "1674878000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883096.60493: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883096.60642: _low_level_execute_command(): starting 28983 1726883096.60646: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883096.1065469-33598-61391355735647/ > /dev/null 2>&1 && sleep 0' 28983 1726883096.62056: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883096.62082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883096.62146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883096.62212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883096.62262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883096.64272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883096.64374: stderr chunk (state=3): >>><<< 28983 1726883096.64539: stdout chunk (state=3): >>><<< 28983 1726883096.64543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883096.64545: handler run complete 28983 1726883096.64549: attempt loop complete, returning result 28983 
1726883096.64552: _execute() done 28983 1726883096.64555: dumping result to json 28983 1726883096.64557: done dumping result, returning 28983 1726883096.64573: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000001d37] 28983 1726883096.64582: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d37 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883096.65085: no more pending results, returning what we have 28983 1726883096.65090: results queue empty 28983 1726883096.65092: checking for any_errors_fatal 28983 1726883096.65102: done checking for any_errors_fatal 28983 1726883096.65103: checking for max_fail_percentage 28983 1726883096.65106: done checking for max_fail_percentage 28983 1726883096.65107: checking to see if all hosts have failed and the running result is not ok 28983 1726883096.65108: done checking to see if all hosts have failed 28983 1726883096.65109: getting the remaining hosts for this loop 28983 1726883096.65112: done getting the remaining hosts for this loop 28983 1726883096.65117: getting the next task for host managed_node2 28983 1726883096.65128: done getting next task for host managed_node2 28983 1726883096.65132: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883096.65141: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883096.65164: getting variables 28983 1726883096.65166: in VariableManager get_vars() 28983 1726883096.65220: Calling all_inventory to load vars for managed_node2 28983 1726883096.65224: Calling groups_inventory to load vars for managed_node2 28983 1726883096.65227: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883096.65460: Calling all_plugins_play to load vars for managed_node2 28983 1726883096.65465: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883096.65474: Calling groups_plugins_play to load vars for managed_node2 28983 1726883096.66044: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d37 28983 1726883096.66050: WORKER PROCESS EXITING 28983 1726883096.68644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883096.72096: done with get_vars() 28983 1726883096.72122: done getting variables 28983 1726883096.72185: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:56 -0400 (0:00:00.895) 0:02:06.719 ****** 28983 1726883096.72219: entering _queue_task() for managed_node2/service 28983 1726883096.72524: worker is 1 (out of 1 available) 28983 1726883096.72541: exiting _queue_task() for managed_node2/service 28983 1726883096.72556: done queuing things up, now waiting for results queue to drain 28983 1726883096.72558: waiting for pending results... 28983 1726883096.72778: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883096.72896: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d38 28983 1726883096.72911: variable 'ansible_search_path' from source: unknown 28983 1726883096.72914: variable 'ansible_search_path' from source: unknown 28983 1726883096.72950: calling self._execute() 28983 1726883096.73045: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883096.73052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883096.73062: variable 'omit' from source: magic vars 28983 1726883096.73418: variable 'ansible_distribution_major_version' from source: facts 28983 1726883096.73430: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883096.73566: variable 'network_provider' from source: set_fact 28983 1726883096.73570: Evaluated conditional (network_provider == "nm"): True 28983 1726883096.73661: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883096.73953: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 28983 1726883096.73984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883096.77080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883096.77156: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883096.77195: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883096.77236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883096.77269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883096.77375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883096.77412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883096.77442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883096.77491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883096.77508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883096.77565: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883096.77593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883096.77623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883096.77693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883096.77705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883096.77758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883096.77781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883096.77804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883096.77837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883096.77849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883096.78163: variable 'network_connections' from source: include params 28983 1726883096.78167: variable 'interface' from source: play vars 28983 1726883096.78170: variable 'interface' from source: play vars 28983 1726883096.78175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883096.78403: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883096.78416: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883096.78462: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883096.78501: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883096.78599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883096.78607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883096.78611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883096.78647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883096.78703: variable 
'__network_wireless_connections_defined' from source: role '' defaults 28983 1726883096.79502: variable 'network_connections' from source: include params 28983 1726883096.79505: variable 'interface' from source: play vars 28983 1726883096.79584: variable 'interface' from source: play vars 28983 1726883096.79617: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883096.79620: when evaluation is False, skipping this task 28983 1726883096.79623: _execute() done 28983 1726883096.79702: dumping result to json 28983 1726883096.79705: done dumping result, returning 28983 1726883096.79707: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000001d38] 28983 1726883096.79718: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d38 28983 1726883096.79966: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d38 28983 1726883096.79969: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883096.80025: no more pending results, returning what we have 28983 1726883096.80029: results queue empty 28983 1726883096.80030: checking for any_errors_fatal 28983 1726883096.80056: done checking for any_errors_fatal 28983 1726883096.80057: checking for max_fail_percentage 28983 1726883096.80059: done checking for max_fail_percentage 28983 1726883096.80060: checking to see if all hosts have failed and the running result is not ok 28983 1726883096.80061: done checking to see if all hosts have failed 28983 1726883096.80062: getting the remaining hosts for this loop 28983 1726883096.80064: done getting the remaining hosts for this loop 28983 1726883096.80069: getting the next task for host managed_node2 28983 1726883096.80083: done getting next task for host managed_node2 28983 1726883096.80088: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 28983 1726883096.80094: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883096.80119: getting variables 28983 1726883096.80121: in VariableManager get_vars() 28983 1726883096.80280: Calling all_inventory to load vars for managed_node2 28983 1726883096.80284: Calling groups_inventory to load vars for managed_node2 28983 1726883096.80287: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883096.80298: Calling all_plugins_play to load vars for managed_node2 28983 1726883096.80302: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883096.80307: Calling groups_plugins_play to load vars for managed_node2 28983 1726883096.82514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883096.85663: done with get_vars() 28983 1726883096.85703: done getting variables 28983 1726883096.85774: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:56 -0400 (0:00:00.135) 0:02:06.855 ****** 28983 1726883096.85817: entering _queue_task() for managed_node2/service 28983 1726883096.86160: worker is 1 (out of 1 available) 28983 1726883096.86175: exiting _queue_task() for managed_node2/service 28983 1726883096.86189: done queuing things up, now waiting for results queue to drain 28983 1726883096.86190: waiting for pending results... 
28983 1726883096.86564: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883096.86704: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d39 28983 1726883096.86768: variable 'ansible_search_path' from source: unknown 28983 1726883096.86775: variable 'ansible_search_path' from source: unknown 28983 1726883096.86785: calling self._execute() 28983 1726883096.86910: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883096.86924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883096.86941: variable 'omit' from source: magic vars 28983 1726883096.87422: variable 'ansible_distribution_major_version' from source: facts 28983 1726883096.87425: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883096.87574: variable 'network_provider' from source: set_fact 28983 1726883096.87588: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883096.87597: when evaluation is False, skipping this task 28983 1726883096.87604: _execute() done 28983 1726883096.87639: dumping result to json 28983 1726883096.87642: done dumping result, returning 28983 1726883096.87646: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000001d39] 28983 1726883096.87649: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d39 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883096.87889: no more pending results, returning what we have 28983 1726883096.87894: results queue empty 28983 1726883096.87895: checking for any_errors_fatal 28983 1726883096.87908: done checking for any_errors_fatal 28983 1726883096.87909: checking for max_fail_percentage 28983 1726883096.87912: done checking for max_fail_percentage 28983 
1726883096.87913: checking to see if all hosts have failed and the running result is not ok 28983 1726883096.87914: done checking to see if all hosts have failed 28983 1726883096.87915: getting the remaining hosts for this loop 28983 1726883096.87917: done getting the remaining hosts for this loop 28983 1726883096.87922: getting the next task for host managed_node2 28983 1726883096.87932: done getting next task for host managed_node2 28983 1726883096.87940: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883096.87948: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883096.87984: getting variables 28983 1726883096.87986: in VariableManager get_vars() 28983 1726883096.88249: Calling all_inventory to load vars for managed_node2 28983 1726883096.88253: Calling groups_inventory to load vars for managed_node2 28983 1726883096.88257: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883096.88263: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d39 28983 1726883096.88266: WORKER PROCESS EXITING 28983 1726883096.88278: Calling all_plugins_play to load vars for managed_node2 28983 1726883096.88282: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883096.88286: Calling groups_plugins_play to load vars for managed_node2 28983 1726883096.92898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883096.97526: done with get_vars() 28983 1726883096.97598: done getting variables 28983 1726883096.97674: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:56 -0400 (0:00:00.118) 0:02:06.974 ****** 28983 1726883096.97718: entering _queue_task() for managed_node2/copy 28983 1726883096.98090: worker is 1 (out of 1 available) 28983 1726883096.98106: exiting _queue_task() for managed_node2/copy 28983 1726883096.98119: done queuing things up, now waiting for results queue to drain 28983 1726883096.98120: waiting for pending results... 
28983 1726883096.98421: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883096.98618: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d3a 28983 1726883096.98643: variable 'ansible_search_path' from source: unknown 28983 1726883096.98651: variable 'ansible_search_path' from source: unknown 28983 1726883096.98699: calling self._execute() 28983 1726883096.98822: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883096.98839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883096.98857: variable 'omit' from source: magic vars 28983 1726883096.99377: variable 'ansible_distribution_major_version' from source: facts 28983 1726883096.99397: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883096.99592: variable 'network_provider' from source: set_fact 28983 1726883096.99629: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883096.99633: when evaluation is False, skipping this task 28983 1726883096.99740: _execute() done 28983 1726883096.99744: dumping result to json 28983 1726883096.99746: done dumping result, returning 28983 1726883096.99750: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000001d3a] 28983 1726883096.99754: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3a 28983 1726883096.99845: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3a 28983 1726883096.99849: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883096.99910: no more pending results, returning what we have 28983 1726883096.99916: results queue empty 28983 1726883096.99917: checking for 
any_errors_fatal 28983 1726883096.99927: done checking for any_errors_fatal 28983 1726883096.99929: checking for max_fail_percentage 28983 1726883096.99931: done checking for max_fail_percentage 28983 1726883096.99932: checking to see if all hosts have failed and the running result is not ok 28983 1726883096.99933: done checking to see if all hosts have failed 28983 1726883096.99936: getting the remaining hosts for this loop 28983 1726883096.99938: done getting the remaining hosts for this loop 28983 1726883096.99943: getting the next task for host managed_node2 28983 1726883096.99955: done getting next task for host managed_node2 28983 1726883096.99961: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883096.99970: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883097.00255: getting variables 28983 1726883097.00257: in VariableManager get_vars() 28983 1726883097.00302: Calling all_inventory to load vars for managed_node2 28983 1726883097.00305: Calling groups_inventory to load vars for managed_node2 28983 1726883097.00308: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883097.00318: Calling all_plugins_play to load vars for managed_node2 28983 1726883097.00322: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883097.00326: Calling groups_plugins_play to load vars for managed_node2 28983 1726883097.03555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883097.08385: done with get_vars() 28983 1726883097.08421: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:57 -0400 (0:00:00.108) 0:02:07.083 ****** 28983 1726883097.08531: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883097.08908: worker is 1 (out of 1 available) 28983 1726883097.08921: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883097.08936: done queuing things up, now waiting for results queue to drain 28983 1726883097.08938: waiting for pending results... 
28983 1726883097.09481: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883097.09667: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d3b 28983 1726883097.09674: variable 'ansible_search_path' from source: unknown 28983 1726883097.09678: variable 'ansible_search_path' from source: unknown 28983 1726883097.09727: calling self._execute() 28983 1726883097.09888: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883097.09939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883097.09944: variable 'omit' from source: magic vars 28983 1726883097.10647: variable 'ansible_distribution_major_version' from source: facts 28983 1726883097.10679: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883097.10702: variable 'omit' from source: magic vars 28983 1726883097.11055: variable 'omit' from source: magic vars 28983 1726883097.11396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883097.27516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883097.27614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883097.27654: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883097.27722: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883097.27737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883097.27845: variable 'network_provider' from source: set_fact 28983 1726883097.28030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883097.28069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883097.28105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883097.28202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883097.28206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883097.28377: variable 'omit' from source: magic vars 28983 1726883097.28445: variable 'omit' from source: magic vars 28983 1726883097.28609: variable 'network_connections' from source: include params 28983 1726883097.28620: variable 'interface' from source: play vars 28983 1726883097.28700: variable 'interface' from source: play vars 28983 1726883097.28895: variable 'omit' from source: magic vars 28983 1726883097.28904: variable '__lsr_ansible_managed' from source: task vars 28983 1726883097.28989: variable '__lsr_ansible_managed' from source: task vars 28983 1726883097.29274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883097.29550: Loaded config def from plugin (lookup/template) 28983 1726883097.29554: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883097.29611: File lookup term: get_ansible_managed.j2 28983 1726883097.29614: variable 
'ansible_search_path' from source: unknown 28983 1726883097.29622: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883097.29639: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883097.29656: variable 'ansible_search_path' from source: unknown 28983 1726883097.46643: variable 'ansible_managed' from source: unknown 28983 1726883097.46903: variable 'omit' from source: magic vars 28983 1726883097.47134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883097.47141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883097.47143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883097.47145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883097.47342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883097.47348: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883097.47350: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883097.47353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883097.47512: Set connection var ansible_connection to ssh 28983 1726883097.47533: Set connection var ansible_shell_executable to /bin/sh 28983 1726883097.47686: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883097.47690: Set connection var ansible_timeout to 10 28983 1726883097.47692: Set connection var ansible_pipelining to False 28983 1726883097.47694: Set connection var ansible_shell_type to sh 28983 1726883097.47720: variable 'ansible_shell_executable' from source: unknown 28983 1726883097.47795: variable 'ansible_connection' from source: unknown 28983 1726883097.47840: variable 'ansible_module_compression' from source: unknown 28983 1726883097.47843: variable 'ansible_shell_type' from source: unknown 28983 1726883097.47845: variable 'ansible_shell_executable' from source: unknown 28983 1726883097.47848: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883097.47850: variable 'ansible_pipelining' from source: unknown 28983 1726883097.47852: variable 'ansible_timeout' from source: unknown 28983 1726883097.47854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883097.48148: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883097.48379: variable 'omit' from 
source: magic vars 28983 1726883097.48383: starting attempt loop 28983 1726883097.48386: running the handler 28983 1726883097.48388: _low_level_execute_command(): starting 28983 1726883097.48390: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883097.49659: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883097.49680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883097.49700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883097.49752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883097.49842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883097.49885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883097.50217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883097.51836: stdout chunk (state=3): >>>/root <<< 28983 1726883097.52009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883097.52022: stdout chunk (state=3): >>><<< 28983 1726883097.52278: stderr chunk 
(state=3): >>><<< 28983 1726883097.52283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883097.52286: _low_level_execute_command(): starting 28983 1726883097.52290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384 `" && echo ansible-tmp-1726883097.5217967-33647-277718039977384="` echo /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384 `" ) && sleep 0' 28983 1726883097.53843: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883097.53846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883097.53849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883097.53852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883097.54441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883097.54631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883097.54870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883097.57127: stdout chunk (state=3): >>>ansible-tmp-1726883097.5217967-33647-277718039977384=/root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384 <<< 28983 1726883097.57216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883097.57219: stdout chunk (state=3): >>><<< 28983 1726883097.57230: stderr chunk (state=3): >>><<< 28983 1726883097.57309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883097.5217967-33647-277718039977384=/root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883097.57360: variable 'ansible_module_compression' from source: unknown 28983 1726883097.57630: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883097.57766: variable 'ansible_facts' from source: unknown 28983 1726883097.57982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py 28983 1726883097.58358: Sending initial data 28983 1726883097.58361: Sent initial data (168 bytes) 28983 1726883097.59488: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883097.59492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883097.59495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883097.59497: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883097.59500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883097.59506: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883097.59543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883097.59647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883097.59689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883097.59787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883097.61503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883097.61620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883097.61939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpjt26yz9a /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py <<< 28983 1726883097.61943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpjt26yz9a" to remote "/root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py" <<< 28983 1726883097.65293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883097.65510: stderr chunk (state=3): >>><<< 28983 1726883097.65518: stdout chunk (state=3): >>><<< 28983 1726883097.65546: done transferring module to remote 28983 1726883097.65616: _low_level_execute_command(): starting 28983 1726883097.65620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/ /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py && sleep 0' 28983 1726883097.67273: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883097.67277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883097.67369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883097.67385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883097.67394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883097.67599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883097.69468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883097.69684: stderr chunk (state=3): >>><<< 28983 1726883097.69687: stdout chunk (state=3): >>><<< 28983 1726883097.69766: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883097.69770: _low_level_execute_command(): starting 28983 1726883097.69781: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/AnsiballZ_network_connections.py && sleep 0' 28983 1726883097.71116: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883097.71155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883097.71532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883097.71537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 
1726883097.71877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883098.07517: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883098.09596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883098.09600: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883098.09617: stderr chunk (state=3): >>><<< 28983 1726883098.09620: stdout chunk (state=3): >>><<< 28983 1726883098.09646: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883098.09779: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883098.09784: _low_level_execute_command(): starting 28983 1726883098.09786: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883097.5217967-33647-277718039977384/ > /dev/null 2>&1 && sleep 0' 28983 1726883098.10450: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883098.10490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883098.10517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883098.10521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883098.10639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883098.12651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883098.12693: stderr chunk (state=3): >>><<< 28983 1726883098.12697: stdout chunk (state=3): >>><<< 28983 1726883098.12712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883098.12720: handler run complete 28983 1726883098.12746: attempt loop complete, returning result 28983 1726883098.12754: _execute() done 28983 1726883098.12757: dumping result to json 28983 1726883098.12759: done dumping result, returning 28983 1726883098.12768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-000000001d3b] 28983 1726883098.12774: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3b 28983 1726883098.13131: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3b 28983 1726883098.13138: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 28983 1726883098.13465: no more pending results, returning what we have 28983 1726883098.13469: results queue empty 28983 1726883098.13470: checking for any_errors_fatal 28983 1726883098.13480: done checking for any_errors_fatal 28983 1726883098.13481: checking for max_fail_percentage 28983 1726883098.13484: done checking for max_fail_percentage 28983 1726883098.13485: checking to see if all hosts have failed and the running result is not ok 28983 1726883098.13486: done checking to see if all hosts have failed 28983 1726883098.13490: getting the remaining hosts for this loop 28983 1726883098.13493: done getting the remaining hosts for this loop 28983 1726883098.13497: getting the next task for host managed_node2 28983 1726883098.13505: done getting next task for 
host managed_node2 28983 1726883098.13509: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883098.13515: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883098.13528: getting variables 28983 1726883098.13529: in VariableManager get_vars() 28983 1726883098.13693: Calling all_inventory to load vars for managed_node2 28983 1726883098.13696: Calling groups_inventory to load vars for managed_node2 28983 1726883098.13699: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883098.13709: Calling all_plugins_play to load vars for managed_node2 28983 1726883098.13712: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883098.13716: Calling groups_plugins_play to load vars for managed_node2 28983 1726883098.25462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883098.29513: done with get_vars() 28983 1726883098.29551: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:58 -0400 (0:00:01.211) 0:02:08.294 ****** 28983 1726883098.29712: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883098.30302: worker is 1 (out of 1 available) 28983 1726883098.30317: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883098.30331: done queuing things up, now waiting for results queue to drain 28983 1726883098.30372: waiting for pending results... 
28983 1726883098.30793: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883098.30992: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d3c 28983 1726883098.31241: variable 'ansible_search_path' from source: unknown 28983 1726883098.31245: variable 'ansible_search_path' from source: unknown 28983 1726883098.31249: calling self._execute() 28983 1726883098.31253: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.31255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.31260: variable 'omit' from source: magic vars 28983 1726883098.31765: variable 'ansible_distribution_major_version' from source: facts 28983 1726883098.31782: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883098.31953: variable 'network_state' from source: role '' defaults 28983 1726883098.31966: Evaluated conditional (network_state != {}): False 28983 1726883098.31969: when evaluation is False, skipping this task 28983 1726883098.31973: _execute() done 28983 1726883098.31990: dumping result to json 28983 1726883098.31994: done dumping result, returning 28983 1726883098.32002: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000001d3c] 28983 1726883098.32009: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3c 28983 1726883098.32136: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3c 28983 1726883098.32139: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883098.32210: no more pending results, returning what we have 28983 1726883098.32218: results queue empty 28983 1726883098.32219: checking for any_errors_fatal 28983 1726883098.32242: done checking for any_errors_fatal 
28983 1726883098.32244: checking for max_fail_percentage 28983 1726883098.32246: done checking for max_fail_percentage 28983 1726883098.32247: checking to see if all hosts have failed and the running result is not ok 28983 1726883098.32248: done checking to see if all hosts have failed 28983 1726883098.32249: getting the remaining hosts for this loop 28983 1726883098.32251: done getting the remaining hosts for this loop 28983 1726883098.32255: getting the next task for host managed_node2 28983 1726883098.32263: done getting next task for host managed_node2 28983 1726883098.32268: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883098.32274: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883098.32298: getting variables 28983 1726883098.32300: in VariableManager get_vars() 28983 1726883098.32473: Calling all_inventory to load vars for managed_node2 28983 1726883098.32477: Calling groups_inventory to load vars for managed_node2 28983 1726883098.32481: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883098.32491: Calling all_plugins_play to load vars for managed_node2 28983 1726883098.32495: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883098.32499: Calling groups_plugins_play to load vars for managed_node2 28983 1726883098.36030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883098.40341: done with get_vars() 28983 1726883098.40380: done getting variables 28983 1726883098.40461: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:58 -0400 (0:00:00.108) 0:02:08.402 ****** 28983 1726883098.40506: entering _queue_task() for managed_node2/debug 28983 1726883098.40916: worker is 1 (out of 1 available) 28983 1726883098.40929: exiting _queue_task() for managed_node2/debug 28983 1726883098.40995: done queuing things up, now waiting for results queue to drain 28983 1726883098.40997: waiting for pending results... 
28983 1726883098.41651: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883098.41728: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d3d 28983 1726883098.41812: variable 'ansible_search_path' from source: unknown 28983 1726883098.41816: variable 'ansible_search_path' from source: unknown 28983 1726883098.41853: calling self._execute() 28983 1726883098.42198: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.42205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.42218: variable 'omit' from source: magic vars 28983 1726883098.43088: variable 'ansible_distribution_major_version' from source: facts 28983 1726883098.43100: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883098.43108: variable 'omit' from source: magic vars 28983 1726883098.43315: variable 'omit' from source: magic vars 28983 1726883098.43478: variable 'omit' from source: magic vars 28983 1726883098.43529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883098.43640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883098.43664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883098.43738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883098.43796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883098.43909: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883098.43913: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.43918: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883098.44441: Set connection var ansible_connection to ssh 28983 1726883098.44444: Set connection var ansible_shell_executable to /bin/sh 28983 1726883098.44447: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883098.44450: Set connection var ansible_timeout to 10 28983 1726883098.44453: Set connection var ansible_pipelining to False 28983 1726883098.44455: Set connection var ansible_shell_type to sh 28983 1726883098.44458: variable 'ansible_shell_executable' from source: unknown 28983 1726883098.44461: variable 'ansible_connection' from source: unknown 28983 1726883098.44464: variable 'ansible_module_compression' from source: unknown 28983 1726883098.44466: variable 'ansible_shell_type' from source: unknown 28983 1726883098.44468: variable 'ansible_shell_executable' from source: unknown 28983 1726883098.44473: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.44476: variable 'ansible_pipelining' from source: unknown 28983 1726883098.44478: variable 'ansible_timeout' from source: unknown 28983 1726883098.44481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.44484: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883098.44495: variable 'omit' from source: magic vars 28983 1726883098.44502: starting attempt loop 28983 1726883098.44505: running the handler 28983 1726883098.44684: variable '__network_connections_result' from source: set_fact 28983 1726883098.44939: handler run complete 28983 1726883098.44942: attempt loop complete, returning result 28983 1726883098.44945: _execute() done 28983 1726883098.44947: dumping result to json 28983 1726883098.44949: 
done dumping result, returning 28983 1726883098.44951: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001d3d] 28983 1726883098.44953: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3d 28983 1726883098.45024: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3d 28983 1726883098.45028: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 28983 1726883098.45117: no more pending results, returning what we have 28983 1726883098.45121: results queue empty 28983 1726883098.45122: checking for any_errors_fatal 28983 1726883098.45252: done checking for any_errors_fatal 28983 1726883098.45254: checking for max_fail_percentage 28983 1726883098.45256: done checking for max_fail_percentage 28983 1726883098.45258: checking to see if all hosts have failed and the running result is not ok 28983 1726883098.45259: done checking to see if all hosts have failed 28983 1726883098.45259: getting the remaining hosts for this loop 28983 1726883098.45261: done getting the remaining hosts for this loop 28983 1726883098.45266: getting the next task for host managed_node2 28983 1726883098.45274: done getting next task for host managed_node2 28983 1726883098.45279: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883098.45286: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883098.45300: getting variables 28983 1726883098.45302: in VariableManager get_vars() 28983 1726883098.45353: Calling all_inventory to load vars for managed_node2 28983 1726883098.45357: Calling groups_inventory to load vars for managed_node2 28983 1726883098.45360: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883098.45370: Calling all_plugins_play to load vars for managed_node2 28983 1726883098.45374: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883098.45378: Calling groups_plugins_play to load vars for managed_node2 28983 1726883098.47792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883098.52196: done with get_vars() 28983 1726883098.52230: done getting variables 28983 1726883098.52341: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:58 -0400 (0:00:00.118) 0:02:08.522 ****** 28983 1726883098.52504: entering _queue_task() for managed_node2/debug 28983 1726883098.53048: worker is 1 (out of 1 available) 28983 1726883098.53059: exiting _queue_task() for managed_node2/debug 28983 1726883098.53072: done queuing things up, now waiting for results queue to drain 28983 1726883098.53074: waiting for pending results... 28983 1726883098.53574: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883098.53580: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d3e 28983 1726883098.53583: variable 'ansible_search_path' from source: unknown 28983 1726883098.53586: variable 'ansible_search_path' from source: unknown 28983 1726883098.53600: calling self._execute() 28983 1726883098.53720: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.53726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.53749: variable 'omit' from source: magic vars 28983 1726883098.54228: variable 'ansible_distribution_major_version' from source: facts 28983 1726883098.54440: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883098.54444: variable 'omit' from source: magic vars 28983 1726883098.54447: variable 'omit' from source: magic vars 28983 1726883098.54450: variable 'omit' from source: magic vars 28983 1726883098.54453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883098.54500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883098.54523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883098.54546: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883098.54559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883098.54609: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883098.54613: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.54616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.54747: Set connection var ansible_connection to ssh 28983 1726883098.54760: Set connection var ansible_shell_executable to /bin/sh 28983 1726883098.54775: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883098.54783: Set connection var ansible_timeout to 10 28983 1726883098.54805: Set connection var ansible_pipelining to False 28983 1726883098.54808: Set connection var ansible_shell_type to sh 28983 1726883098.54837: variable 'ansible_shell_executable' from source: unknown 28983 1726883098.54841: variable 'ansible_connection' from source: unknown 28983 1726883098.54844: variable 'ansible_module_compression' from source: unknown 28983 1726883098.54847: variable 'ansible_shell_type' from source: unknown 28983 1726883098.54849: variable 'ansible_shell_executable' from source: unknown 28983 1726883098.54855: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.54860: variable 'ansible_pipelining' from source: unknown 28983 1726883098.54863: variable 'ansible_timeout' from source: unknown 28983 1726883098.54870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.55242: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883098.55247: variable 'omit' from source: magic vars 28983 1726883098.55250: starting attempt loop 28983 1726883098.55252: running the handler 28983 1726883098.55254: variable '__network_connections_result' from source: set_fact 28983 1726883098.55639: variable '__network_connections_result' from source: set_fact 28983 1726883098.55702: handler run complete 28983 1726883098.55737: attempt loop complete, returning result 28983 1726883098.55741: _execute() done 28983 1726883098.55743: dumping result to json 28983 1726883098.55752: done dumping result, returning 28983 1726883098.55848: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-000000001d3e] 28983 1726883098.55853: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3e 28983 1726883098.55976: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3e 28983 1726883098.55980: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 28983 1726883098.56251: no more pending results, returning what we have 28983 1726883098.56255: results queue empty 28983 1726883098.56256: checking for any_errors_fatal 28983 1726883098.56262: 
done checking for any_errors_fatal 28983 1726883098.56263: checking for max_fail_percentage 28983 1726883098.56265: done checking for max_fail_percentage 28983 1726883098.56266: checking to see if all hosts have failed and the running result is not ok 28983 1726883098.56267: done checking to see if all hosts have failed 28983 1726883098.56268: getting the remaining hosts for this loop 28983 1726883098.56270: done getting the remaining hosts for this loop 28983 1726883098.56275: getting the next task for host managed_node2 28983 1726883098.56284: done getting next task for host managed_node2 28983 1726883098.56288: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883098.56294: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883098.56307: getting variables 28983 1726883098.56309: in VariableManager get_vars() 28983 1726883098.56558: Calling all_inventory to load vars for managed_node2 28983 1726883098.56563: Calling groups_inventory to load vars for managed_node2 28983 1726883098.56566: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883098.56581: Calling all_plugins_play to load vars for managed_node2 28983 1726883098.56585: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883098.56589: Calling groups_plugins_play to load vars for managed_node2 28983 1726883098.62123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883098.68769: done with get_vars() 28983 1726883098.68824: done getting variables 28983 1726883098.68907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:58 -0400 (0:00:00.164) 0:02:08.687 ****** 28983 1726883098.68955: entering _queue_task() for managed_node2/debug 28983 1726883098.69375: worker is 1 (out of 1 available) 28983 1726883098.69390: exiting _queue_task() for managed_node2/debug 28983 1726883098.69404: done queuing things up, now waiting for results queue to drain 28983 1726883098.69406: waiting for pending results... 
28983 1726883098.69714: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883098.69916: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d3f 28983 1726883098.69943: variable 'ansible_search_path' from source: unknown 28983 1726883098.69954: variable 'ansible_search_path' from source: unknown 28983 1726883098.70010: calling self._execute() 28983 1726883098.70137: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.70153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.70177: variable 'omit' from source: magic vars 28983 1726883098.70970: variable 'ansible_distribution_major_version' from source: facts 28983 1726883098.70992: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883098.71161: variable 'network_state' from source: role '' defaults 28983 1726883098.71186: Evaluated conditional (network_state != {}): False 28983 1726883098.71195: when evaluation is False, skipping this task 28983 1726883098.71234: _execute() done 28983 1726883098.71241: dumping result to json 28983 1726883098.71244: done dumping result, returning 28983 1726883098.71247: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-000000001d3f] 28983 1726883098.71254: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3f 28983 1726883098.71640: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d3f 28983 1726883098.71645: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883098.71710: no more pending results, returning what we have 28983 1726883098.71714: results queue empty 28983 1726883098.71715: checking for any_errors_fatal 28983 1726883098.71729: done checking for any_errors_fatal 28983 1726883098.71730: checking for 
max_fail_percentage 28983 1726883098.71732: done checking for max_fail_percentage 28983 1726883098.71736: checking to see if all hosts have failed and the running result is not ok 28983 1726883098.71737: done checking to see if all hosts have failed 28983 1726883098.71738: getting the remaining hosts for this loop 28983 1726883098.71741: done getting the remaining hosts for this loop 28983 1726883098.71746: getting the next task for host managed_node2 28983 1726883098.71755: done getting next task for host managed_node2 28983 1726883098.71760: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883098.71768: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883098.71805: getting variables 28983 1726883098.71807: in VariableManager get_vars() 28983 1726883098.72051: Calling all_inventory to load vars for managed_node2 28983 1726883098.72059: Calling groups_inventory to load vars for managed_node2 28983 1726883098.72063: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883098.72075: Calling all_plugins_play to load vars for managed_node2 28983 1726883098.72080: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883098.72084: Calling groups_plugins_play to load vars for managed_node2 28983 1726883098.74810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883098.78897: done with get_vars() 28983 1726883098.78938: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:58 -0400 (0:00:00.101) 0:02:08.788 ****** 28983 1726883098.79070: entering _queue_task() for managed_node2/ping 28983 1726883098.79461: worker is 1 (out of 1 available) 28983 1726883098.79475: exiting _queue_task() for managed_node2/ping 28983 1726883098.79488: done queuing things up, now waiting for results queue to drain 28983 1726883098.79489: waiting for pending results... 
28983 1726883098.80293: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883098.80716: in run() - task 0affe814-3a2d-b16d-c0a7-000000001d40 28983 1726883098.80720: variable 'ansible_search_path' from source: unknown 28983 1726883098.80723: variable 'ansible_search_path' from source: unknown 28983 1726883098.81140: calling self._execute() 28983 1726883098.81144: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.81149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.81152: variable 'omit' from source: magic vars 28983 1726883098.82085: variable 'ansible_distribution_major_version' from source: facts 28983 1726883098.82159: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883098.82175: variable 'omit' from source: magic vars 28983 1726883098.82384: variable 'omit' from source: magic vars 28983 1726883098.82436: variable 'omit' from source: magic vars 28983 1726883098.82639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883098.82644: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883098.82730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883098.82762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883098.82784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883098.82837: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883098.82849: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.82858: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883098.82997: Set connection var ansible_connection to ssh 28983 1726883098.83026: Set connection var ansible_shell_executable to /bin/sh 28983 1726883098.83044: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883098.83061: Set connection var ansible_timeout to 10 28983 1726883098.83215: Set connection var ansible_pipelining to False 28983 1726883098.83224: Set connection var ansible_shell_type to sh 28983 1726883098.83262: variable 'ansible_shell_executable' from source: unknown 28983 1726883098.83274: variable 'ansible_connection' from source: unknown 28983 1726883098.83284: variable 'ansible_module_compression' from source: unknown 28983 1726883098.83294: variable 'ansible_shell_type' from source: unknown 28983 1726883098.83341: variable 'ansible_shell_executable' from source: unknown 28983 1726883098.83345: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883098.83347: variable 'ansible_pipelining' from source: unknown 28983 1726883098.83350: variable 'ansible_timeout' from source: unknown 28983 1726883098.83352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883098.83626: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883098.83648: variable 'omit' from source: magic vars 28983 1726883098.83659: starting attempt loop 28983 1726883098.83674: running the handler 28983 1726883098.83780: _low_level_execute_command(): starting 28983 1726883098.83783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883098.84847: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883098.84980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883098.85202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883098.85308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883098.87103: stdout chunk (state=3): >>>/root <<< 28983 1726883098.87331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883098.87336: stdout chunk (state=3): >>><<< 28983 1726883098.87340: stderr chunk (state=3): >>><<< 28983 1726883098.87485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883098.87490: _low_level_execute_command(): starting 28983 1726883098.87493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426 `" && echo ansible-tmp-1726883098.8737578-33712-84522118131426="` echo /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426 `" ) && sleep 0' 28983 1726883098.88590: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883098.88652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883098.88674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883098.88681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883098.88692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883098.88740: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883098.88744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883098.88752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883098.88755: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883098.88820: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883098.88997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883098.89148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883098.91324: stdout chunk (state=3): >>>ansible-tmp-1726883098.8737578-33712-84522118131426=/root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426 <<< 28983 1726883098.91339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883098.91547: stderr chunk (state=3): >>><<< 28983 1726883098.91551: stdout chunk (state=3): >>><<< 28983 1726883098.91554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883098.8737578-33712-84522118131426=/root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883098.91556: variable 'ansible_module_compression' from source: unknown 28983 1726883098.91559: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883098.91561: variable 'ansible_facts' from source: unknown 28983 1726883098.91632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py 28983 1726883098.91781: Sending initial data 28983 1726883098.91784: Sent initial data (152 bytes) 28983 1726883098.92426: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883098.92513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883098.92550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883098.92651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883098.94374: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883098.94383: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883098.94445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883098.94517: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxg90x28n /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py <<< 28983 1726883098.94521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py" <<< 28983 1726883098.94584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxg90x28n" to remote "/root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py" <<< 28983 1726883098.95468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883098.95526: stderr chunk (state=3): >>><<< 28983 1726883098.95530: stdout chunk (state=3): >>><<< 28983 1726883098.95549: done transferring module to remote 28983 1726883098.95559: _low_level_execute_command(): starting 28983 1726883098.95564: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/ /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py && sleep 0' 28983 1726883098.96001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883098.96004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883098.96008: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883098.96011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883098.96056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883098.96064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883098.96131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883098.98027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883098.98069: stderr chunk (state=3): >>><<< 28983 1726883098.98076: stdout chunk (state=3): >>><<< 28983 1726883098.98091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883098.98096: _low_level_execute_command(): starting 28983 1726883098.98101: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/AnsiballZ_ping.py && sleep 0' 28983 1726883098.98498: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883098.98544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883098.98548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883098.98550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883098.98552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883098.98555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883098.98599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883098.98606: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883098.98680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.15960: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883099.17374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883099.17422: stderr chunk (state=3): >>><<< 28983 1726883099.17426: stdout chunk (state=3): >>><<< 28983 1726883099.17446: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883099.17468: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883099.17482: _low_level_execute_command(): starting 28983 1726883099.17487: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883098.8737578-33712-84522118131426/ > /dev/null 2>&1 && sleep 0' 28983 1726883099.17901: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883099.17947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883099.17951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883099.17954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883099.17956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883099.17958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.18001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.18004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.18081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.20020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883099.20063: stderr chunk (state=3): >>><<< 28983 1726883099.20067: stdout chunk (state=3): >>><<< 28983 1726883099.20082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
28983 1726883099.20089: handler run complete 28983 1726883099.20110: attempt loop complete, returning result 28983 1726883099.20113: _execute() done 28983 1726883099.20120: dumping result to json 28983 1726883099.20122: done dumping result, returning 28983 1726883099.20132: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-000000001d40] 28983 1726883099.20139: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d40 28983 1726883099.20243: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001d40 28983 1726883099.20246: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883099.20328: no more pending results, returning what we have 28983 1726883099.20332: results queue empty 28983 1726883099.20333: checking for any_errors_fatal 28983 1726883099.20345: done checking for any_errors_fatal 28983 1726883099.20346: checking for max_fail_percentage 28983 1726883099.20347: done checking for max_fail_percentage 28983 1726883099.20349: checking to see if all hosts have failed and the running result is not ok 28983 1726883099.20350: done checking to see if all hosts have failed 28983 1726883099.20350: getting the remaining hosts for this loop 28983 1726883099.20352: done getting the remaining hosts for this loop 28983 1726883099.20358: getting the next task for host managed_node2 28983 1726883099.20370: done getting next task for host managed_node2 28983 1726883099.20375: ^ task is: TASK: meta (role_complete) 28983 1726883099.20381: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883099.20394: getting variables 28983 1726883099.20396: in VariableManager get_vars() 28983 1726883099.20452: Calling all_inventory to load vars for managed_node2 28983 1726883099.20455: Calling groups_inventory to load vars for managed_node2 28983 1726883099.20458: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.20467: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.20473: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.20477: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.21784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.23509: done with get_vars() 28983 1726883099.23532: done getting variables 28983 1726883099.23606: done queuing things up, now waiting for results queue to drain 28983 1726883099.23608: results queue empty 28983 1726883099.23608: checking for any_errors_fatal 28983 1726883099.23611: done checking for any_errors_fatal 28983 1726883099.23611: checking for max_fail_percentage 28983 1726883099.23612: done checking for max_fail_percentage 28983 1726883099.23613: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883099.23613: done checking to see if all hosts have failed 28983 1726883099.23614: getting the remaining hosts for this loop 28983 1726883099.23615: done getting the remaining hosts for this loop 28983 1726883099.23616: getting the next task for host managed_node2 28983 1726883099.23621: done getting next task for host managed_node2 28983 1726883099.23624: ^ task is: TASK: Asserts 28983 1726883099.23625: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883099.23627: getting variables 28983 1726883099.23628: in VariableManager get_vars() 28983 1726883099.23640: Calling all_inventory to load vars for managed_node2 28983 1726883099.23641: Calling groups_inventory to load vars for managed_node2 28983 1726883099.23643: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.23647: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.23649: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.23651: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.24755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.26356: done with get_vars() 28983 1726883099.26377: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:44:59 -0400 (0:00:00.473) 0:02:09.262 ****** 28983 1726883099.26441: entering _queue_task() for managed_node2/include_tasks 28983 1726883099.26690: worker is 1 (out of 1 available) 28983 1726883099.26705: exiting _queue_task() for managed_node2/include_tasks 28983 1726883099.26718: done queuing things up, now waiting for results queue to drain 28983 1726883099.26720: waiting for pending results... 
28983 1726883099.26928: running TaskExecutor() for managed_node2/TASK: Asserts 28983 1726883099.27021: in run() - task 0affe814-3a2d-b16d-c0a7-000000001749 28983 1726883099.27035: variable 'ansible_search_path' from source: unknown 28983 1726883099.27039: variable 'ansible_search_path' from source: unknown 28983 1726883099.27086: variable 'lsr_assert' from source: include params 28983 1726883099.27275: variable 'lsr_assert' from source: include params 28983 1726883099.27336: variable 'omit' from source: magic vars 28983 1726883099.27461: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.27472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.27486: variable 'omit' from source: magic vars 28983 1726883099.27702: variable 'ansible_distribution_major_version' from source: facts 28983 1726883099.27712: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883099.27716: variable 'item' from source: unknown 28983 1726883099.27779: variable 'item' from source: unknown 28983 1726883099.27806: variable 'item' from source: unknown 28983 1726883099.27862: variable 'item' from source: unknown 28983 1726883099.28006: dumping result to json 28983 1726883099.28009: done dumping result, returning 28983 1726883099.28012: done running TaskExecutor() for managed_node2/TASK: Asserts [0affe814-3a2d-b16d-c0a7-000000001749] 28983 1726883099.28014: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001749 28983 1726883099.28093: no more pending results, returning what we have 28983 1726883099.28097: in VariableManager get_vars() 28983 1726883099.28139: Calling all_inventory to load vars for managed_node2 28983 1726883099.28142: Calling groups_inventory to load vars for managed_node2 28983 1726883099.28155: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.28162: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001749 28983 
1726883099.28164: WORKER PROCESS EXITING 28983 1726883099.28173: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.28177: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.28180: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.29562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.31140: done with get_vars() 28983 1726883099.31162: variable 'ansible_search_path' from source: unknown 28983 1726883099.31164: variable 'ansible_search_path' from source: unknown 28983 1726883099.31194: we have included files to process 28983 1726883099.31195: generating all_blocks data 28983 1726883099.31197: done generating all_blocks data 28983 1726883099.31201: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883099.31202: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883099.31204: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883099.31294: in VariableManager get_vars() 28983 1726883099.31310: done with get_vars() 28983 1726883099.31401: done processing included file 28983 1726883099.31403: iterating over new_blocks loaded from include file 28983 1726883099.31404: in VariableManager get_vars() 28983 1726883099.31416: done with get_vars() 28983 1726883099.31417: filtering new block on tags 28983 1726883099.31446: done filtering new block on tags 28983 1726883099.31448: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => 
(item=tasks/assert_profile_absent.yml) 28983 1726883099.31452: extending task lists for all hosts with included blocks 28983 1726883099.32302: done extending task lists 28983 1726883099.32303: done processing included files 28983 1726883099.32304: results queue empty 28983 1726883099.32304: checking for any_errors_fatal 28983 1726883099.32306: done checking for any_errors_fatal 28983 1726883099.32306: checking for max_fail_percentage 28983 1726883099.32307: done checking for max_fail_percentage 28983 1726883099.32308: checking to see if all hosts have failed and the running result is not ok 28983 1726883099.32308: done checking to see if all hosts have failed 28983 1726883099.32309: getting the remaining hosts for this loop 28983 1726883099.32310: done getting the remaining hosts for this loop 28983 1726883099.32312: getting the next task for host managed_node2 28983 1726883099.32315: done getting next task for host managed_node2 28983 1726883099.32317: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28983 1726883099.32319: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883099.32321: getting variables 28983 1726883099.32322: in VariableManager get_vars() 28983 1726883099.32330: Calling all_inventory to load vars for managed_node2 28983 1726883099.32333: Calling groups_inventory to load vars for managed_node2 28983 1726883099.32337: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.32342: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.32345: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.32348: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.33467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.35115: done with get_vars() 28983 1726883099.35138: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:44:59 -0400 (0:00:00.087) 0:02:09.349 ****** 28983 1726883099.35198: entering _queue_task() for managed_node2/include_tasks 28983 1726883099.35437: worker is 1 (out of 1 available) 28983 1726883099.35450: exiting _queue_task() for managed_node2/include_tasks 28983 1726883099.35462: done queuing things up, now waiting for results queue to drain 28983 1726883099.35464: waiting for pending results... 
28983 1726883099.35661: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28983 1726883099.35752: in run() - task 0affe814-3a2d-b16d-c0a7-000000001e99 28983 1726883099.35764: variable 'ansible_search_path' from source: unknown 28983 1726883099.35768: variable 'ansible_search_path' from source: unknown 28983 1726883099.35802: calling self._execute() 28983 1726883099.35891: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.35898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.35910: variable 'omit' from source: magic vars 28983 1726883099.36232: variable 'ansible_distribution_major_version' from source: facts 28983 1726883099.36250: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883099.36254: _execute() done 28983 1726883099.36259: dumping result to json 28983 1726883099.36262: done dumping result, returning 28983 1726883099.36267: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-b16d-c0a7-000000001e99] 28983 1726883099.36276: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001e99 28983 1726883099.36370: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001e99 28983 1726883099.36374: WORKER PROCESS EXITING 28983 1726883099.36405: no more pending results, returning what we have 28983 1726883099.36410: in VariableManager get_vars() 28983 1726883099.36460: Calling all_inventory to load vars for managed_node2 28983 1726883099.36464: Calling groups_inventory to load vars for managed_node2 28983 1726883099.36468: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.36478: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.36483: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.36487: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883099.37738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.39343: done with get_vars() 28983 1726883099.39363: variable 'ansible_search_path' from source: unknown 28983 1726883099.39364: variable 'ansible_search_path' from source: unknown 28983 1726883099.39374: variable 'item' from source: include params 28983 1726883099.39456: variable 'item' from source: include params 28983 1726883099.39488: we have included files to process 28983 1726883099.39489: generating all_blocks data 28983 1726883099.39490: done generating all_blocks data 28983 1726883099.39492: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883099.39493: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883099.39494: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883099.40250: done processing included file 28983 1726883099.40252: iterating over new_blocks loaded from include file 28983 1726883099.40253: in VariableManager get_vars() 28983 1726883099.40267: done with get_vars() 28983 1726883099.40268: filtering new block on tags 28983 1726883099.40324: done filtering new block on tags 28983 1726883099.40326: in VariableManager get_vars() 28983 1726883099.40340: done with get_vars() 28983 1726883099.40341: filtering new block on tags 28983 1726883099.40393: done filtering new block on tags 28983 1726883099.40396: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28983 1726883099.40399: extending task lists for all hosts with included blocks 28983 1726883099.40596: done 
extending task lists 28983 1726883099.40597: done processing included files 28983 1726883099.40597: results queue empty 28983 1726883099.40598: checking for any_errors_fatal 28983 1726883099.40600: done checking for any_errors_fatal 28983 1726883099.40601: checking for max_fail_percentage 28983 1726883099.40602: done checking for max_fail_percentage 28983 1726883099.40602: checking to see if all hosts have failed and the running result is not ok 28983 1726883099.40603: done checking to see if all hosts have failed 28983 1726883099.40603: getting the remaining hosts for this loop 28983 1726883099.40604: done getting the remaining hosts for this loop 28983 1726883099.40606: getting the next task for host managed_node2 28983 1726883099.40610: done getting next task for host managed_node2 28983 1726883099.40611: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883099.40614: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726883099.40616: getting variables 28983 1726883099.40616: in VariableManager get_vars() 28983 1726883099.40624: Calling all_inventory to load vars for managed_node2 28983 1726883099.40626: Calling groups_inventory to load vars for managed_node2 28983 1726883099.40627: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.40631: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.40635: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.40638: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.41748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.43343: done with get_vars() 28983 1726883099.43364: done getting variables 28983 1726883099.43401: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:44:59 -0400 (0:00:00.082) 0:02:09.432 ****** 28983 1726883099.43424: entering _queue_task() for managed_node2/set_fact 28983 1726883099.43642: worker is 1 (out of 1 available) 28983 1726883099.43654: exiting _queue_task() for managed_node2/set_fact 28983 1726883099.43668: done queuing things up, now waiting for results queue to drain 28983 1726883099.43670: waiting for pending results... 
28983 1726883099.43867: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883099.43966: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f17 28983 1726883099.43982: variable 'ansible_search_path' from source: unknown 28983 1726883099.43985: variable 'ansible_search_path' from source: unknown 28983 1726883099.44023: calling self._execute() 28983 1726883099.44106: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.44119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.44128: variable 'omit' from source: magic vars 28983 1726883099.44437: variable 'ansible_distribution_major_version' from source: facts 28983 1726883099.44449: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883099.44452: variable 'omit' from source: magic vars 28983 1726883099.44502: variable 'omit' from source: magic vars 28983 1726883099.44528: variable 'omit' from source: magic vars 28983 1726883099.44575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883099.44602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883099.44621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883099.44638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883099.44648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883099.44681: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883099.44686: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.44689: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883099.44770: Set connection var ansible_connection to ssh 28983 1726883099.44782: Set connection var ansible_shell_executable to /bin/sh 28983 1726883099.44792: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883099.44804: Set connection var ansible_timeout to 10 28983 1726883099.44807: Set connection var ansible_pipelining to False 28983 1726883099.44811: Set connection var ansible_shell_type to sh 28983 1726883099.44829: variable 'ansible_shell_executable' from source: unknown 28983 1726883099.44832: variable 'ansible_connection' from source: unknown 28983 1726883099.44836: variable 'ansible_module_compression' from source: unknown 28983 1726883099.44841: variable 'ansible_shell_type' from source: unknown 28983 1726883099.44843: variable 'ansible_shell_executable' from source: unknown 28983 1726883099.44848: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.44853: variable 'ansible_pipelining' from source: unknown 28983 1726883099.44856: variable 'ansible_timeout' from source: unknown 28983 1726883099.44862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.44979: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883099.44991: variable 'omit' from source: magic vars 28983 1726883099.44998: starting attempt loop 28983 1726883099.45003: running the handler 28983 1726883099.45018: handler run complete 28983 1726883099.45027: attempt loop complete, returning result 28983 1726883099.45030: _execute() done 28983 1726883099.45035: dumping result to json 28983 1726883099.45040: done dumping result, returning 28983 1726883099.45047: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-b16d-c0a7-000000001f17] 28983 1726883099.45052: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f17 28983 1726883099.45146: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f17 28983 1726883099.45149: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28983 1726883099.45211: no more pending results, returning what we have 28983 1726883099.45215: results queue empty 28983 1726883099.45216: checking for any_errors_fatal 28983 1726883099.45217: done checking for any_errors_fatal 28983 1726883099.45218: checking for max_fail_percentage 28983 1726883099.45220: done checking for max_fail_percentage 28983 1726883099.45221: checking to see if all hosts have failed and the running result is not ok 28983 1726883099.45222: done checking to see if all hosts have failed 28983 1726883099.45223: getting the remaining hosts for this loop 28983 1726883099.45225: done getting the remaining hosts for this loop 28983 1726883099.45229: getting the next task for host managed_node2 28983 1726883099.45239: done getting next task for host managed_node2 28983 1726883099.45242: ^ task is: TASK: Stat profile file 28983 1726883099.45248: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883099.45252: getting variables 28983 1726883099.45253: in VariableManager get_vars() 28983 1726883099.45292: Calling all_inventory to load vars for managed_node2 28983 1726883099.45295: Calling groups_inventory to load vars for managed_node2 28983 1726883099.45299: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.45308: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.45311: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.45315: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.46676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.48291: done with get_vars() 28983 1726883099.48312: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:44:59 -0400 (0:00:00.049) 0:02:09.481 ****** 28983 1726883099.48392: entering _queue_task() for managed_node2/stat 28983 1726883099.48626: worker is 1 (out of 1 available) 28983 1726883099.48643: exiting _queue_task() for managed_node2/stat 28983 1726883099.48657: done queuing things up, now waiting for results queue to drain 28983 1726883099.48659: 
waiting for pending results... 28983 1726883099.48850: running TaskExecutor() for managed_node2/TASK: Stat profile file 28983 1726883099.48942: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f18 28983 1726883099.48955: variable 'ansible_search_path' from source: unknown 28983 1726883099.48960: variable 'ansible_search_path' from source: unknown 28983 1726883099.48992: calling self._execute() 28983 1726883099.49081: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.49088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.49100: variable 'omit' from source: magic vars 28983 1726883099.49413: variable 'ansible_distribution_major_version' from source: facts 28983 1726883099.49423: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883099.49431: variable 'omit' from source: magic vars 28983 1726883099.49481: variable 'omit' from source: magic vars 28983 1726883099.49566: variable 'profile' from source: play vars 28983 1726883099.49574: variable 'interface' from source: play vars 28983 1726883099.49624: variable 'interface' from source: play vars 28983 1726883099.49643: variable 'omit' from source: magic vars 28983 1726883099.49684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883099.49715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883099.49736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883099.49751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883099.49762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883099.49795: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883099.49798: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.49801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.49880: Set connection var ansible_connection to ssh 28983 1726883099.49892: Set connection var ansible_shell_executable to /bin/sh 28983 1726883099.49902: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883099.49911: Set connection var ansible_timeout to 10 28983 1726883099.49917: Set connection var ansible_pipelining to False 28983 1726883099.49920: Set connection var ansible_shell_type to sh 28983 1726883099.49941: variable 'ansible_shell_executable' from source: unknown 28983 1726883099.49944: variable 'ansible_connection' from source: unknown 28983 1726883099.49947: variable 'ansible_module_compression' from source: unknown 28983 1726883099.49950: variable 'ansible_shell_type' from source: unknown 28983 1726883099.49955: variable 'ansible_shell_executable' from source: unknown 28983 1726883099.49959: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.49965: variable 'ansible_pipelining' from source: unknown 28983 1726883099.49967: variable 'ansible_timeout' from source: unknown 28983 1726883099.49975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.50148: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883099.50158: variable 'omit' from source: magic vars 28983 1726883099.50164: starting attempt loop 28983 1726883099.50167: running the handler 28983 1726883099.50181: _low_level_execute_command(): starting 28983 1726883099.50188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 
1726883099.50753: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883099.50757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.50760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883099.50762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.50823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883099.50827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.50829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.50914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.52678: stdout chunk (state=3): >>>/root <<< 28983 1726883099.52787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883099.52840: stderr chunk (state=3): >>><<< 28983 1726883099.52846: stdout chunk (state=3): >>><<< 28983 1726883099.52869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883099.52881: _low_level_execute_command(): starting 28983 1726883099.52887: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118 `" && echo ansible-tmp-1726883099.528681-33736-101440372176118="` echo /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118 `" ) && sleep 0' 28983 1726883099.53491: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.53507: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883099.53512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.53532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.53622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.53700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.55690: stdout chunk (state=3): >>>ansible-tmp-1726883099.528681-33736-101440372176118=/root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118 <<< 28983 1726883099.56039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883099.56042: stdout chunk (state=3): >>><<< 28983 1726883099.56045: stderr chunk (state=3): >>><<< 28983 1726883099.56047: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883099.528681-33736-101440372176118=/root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883099.56051: variable 'ansible_module_compression' from source: unknown 28983 1726883099.56053: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883099.56086: variable 'ansible_facts' from source: unknown 28983 1726883099.56199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py 28983 1726883099.56418: Sending initial data 28983 1726883099.56421: Sent initial data (152 bytes) 28983 1726883099.57153: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.57197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883099.57214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.57237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.57333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.58963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883099.59059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883099.59156: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp0yo5ksel /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py <<< 28983 1726883099.59159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py" <<< 28983 1726883099.59209: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp0yo5ksel" to remote "/root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py" <<< 28983 1726883099.60180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883099.60227: stderr chunk (state=3): >>><<< 28983 1726883099.60230: stdout chunk (state=3): >>><<< 28983 1726883099.60250: done transferring module to remote 28983 1726883099.60260: _low_level_execute_command(): starting 28983 1726883099.60266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/ /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py && sleep 0' 28983 1726883099.60716: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883099.60722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.60724: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883099.60727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883099.60731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.60790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.60792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.60863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.62803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883099.62807: stdout chunk (state=3): >>><<< 28983 1726883099.63040: stderr chunk (state=3): >>><<< 28983 1726883099.63044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883099.63047: _low_level_execute_command(): starting 28983 1726883099.63050: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/AnsiballZ_stat.py && sleep 0' 28983 1726883099.63551: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883099.63561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883099.63575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883099.63598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883099.63618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883099.63627: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883099.63639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.63655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883099.63665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883099.63675: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883099.63718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.63784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883099.63798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.63836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.63950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.81028: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883099.82644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883099.82648: stdout chunk (state=3): >>><<< 28983 1726883099.82651: stderr chunk (state=3): >>><<< 28983 1726883099.82655: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883099.82658: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883099.82662: _low_level_execute_command(): starting 28983 1726883099.82664: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883099.528681-33736-101440372176118/ > /dev/null 2>&1 && sleep 0' 28983 1726883099.83215: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883099.83222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883099.83231: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 28983 1726883099.83251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883099.83266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883099.83297: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883099.83300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.83303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883099.83309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883099.83316: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883099.83336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883099.83341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883099.83406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883099.83409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883099.83412: stderr chunk (state=3): >>>debug2: match found <<< 28983 1726883099.83414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883099.83447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883099.83460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883099.83503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883099.83577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883099.85693: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 28983 1726883099.85696: stdout chunk (state=3): >>><<< 28983 1726883099.85699: stderr chunk (state=3): >>><<< 28983 1726883099.85702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883099.85704: handler run complete 28983 1726883099.85706: attempt loop complete, returning result 28983 1726883099.85708: _execute() done 28983 1726883099.85710: dumping result to json 28983 1726883099.85713: done dumping result, returning 28983 1726883099.85715: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affe814-3a2d-b16d-c0a7-000000001f18] 28983 1726883099.85729: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f18 28983 1726883099.85984: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f18 28983 1726883099.85987: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": 
false, "stat": { "exists": false } } 28983 1726883099.86078: no more pending results, returning what we have 28983 1726883099.86084: results queue empty 28983 1726883099.86085: checking for any_errors_fatal 28983 1726883099.86100: done checking for any_errors_fatal 28983 1726883099.86101: checking for max_fail_percentage 28983 1726883099.86104: done checking for max_fail_percentage 28983 1726883099.86105: checking to see if all hosts have failed and the running result is not ok 28983 1726883099.86106: done checking to see if all hosts have failed 28983 1726883099.86107: getting the remaining hosts for this loop 28983 1726883099.86111: done getting the remaining hosts for this loop 28983 1726883099.86117: getting the next task for host managed_node2 28983 1726883099.86128: done getting next task for host managed_node2 28983 1726883099.86131: ^ task is: TASK: Set NM profile exist flag based on the profile files 28983 1726883099.86284: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883099.86292: getting variables 28983 1726883099.86294: in VariableManager get_vars() 28983 1726883099.86467: Calling all_inventory to load vars for managed_node2 28983 1726883099.86474: Calling groups_inventory to load vars for managed_node2 28983 1726883099.86478: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.86490: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.86510: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.86574: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.89074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883099.92072: done with get_vars() 28983 1726883099.92110: done getting variables 28983 1726883099.92183: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:44:59 -0400 (0:00:00.438) 0:02:09.920 ****** 28983 1726883099.92223: entering _queue_task() for managed_node2/set_fact 28983 1726883099.92582: worker is 1 (out of 1 available) 28983 1726883099.92596: exiting _queue_task() for managed_node2/set_fact 28983 1726883099.92611: done queuing things up, now waiting for results queue to drain 28983 1726883099.92612: waiting for pending results... 
28983 1726883099.92962: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28983 1726883099.93077: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f19 28983 1726883099.93166: variable 'ansible_search_path' from source: unknown 28983 1726883099.93170: variable 'ansible_search_path' from source: unknown 28983 1726883099.93182: calling self._execute() 28983 1726883099.93253: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883099.93260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883099.93276: variable 'omit' from source: magic vars 28983 1726883099.93713: variable 'ansible_distribution_major_version' from source: facts 28983 1726883099.93725: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883099.93876: variable 'profile_stat' from source: set_fact 28983 1726883099.93941: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883099.93946: when evaluation is False, skipping this task 28983 1726883099.93949: _execute() done 28983 1726883099.93951: dumping result to json 28983 1726883099.93954: done dumping result, returning 28983 1726883099.93956: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-b16d-c0a7-000000001f19] 28983 1726883099.93959: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f19 28983 1726883099.94127: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f19 28983 1726883099.94132: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883099.94194: no more pending results, returning what we have 28983 1726883099.94200: results queue empty 28983 1726883099.94201: checking for any_errors_fatal 28983 1726883099.94214: done checking for any_errors_fatal 28983 1726883099.94215: 
checking for max_fail_percentage 28983 1726883099.94217: done checking for max_fail_percentage 28983 1726883099.94219: checking to see if all hosts have failed and the running result is not ok 28983 1726883099.94219: done checking to see if all hosts have failed 28983 1726883099.94220: getting the remaining hosts for this loop 28983 1726883099.94223: done getting the remaining hosts for this loop 28983 1726883099.94229: getting the next task for host managed_node2 28983 1726883099.94242: done getting next task for host managed_node2 28983 1726883099.94246: ^ task is: TASK: Get NM profile info 28983 1726883099.94255: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883099.94261: getting variables 28983 1726883099.94263: in VariableManager get_vars() 28983 1726883099.94319: Calling all_inventory to load vars for managed_node2 28983 1726883099.94323: Calling groups_inventory to load vars for managed_node2 28983 1726883099.94328: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883099.94620: Calling all_plugins_play to load vars for managed_node2 28983 1726883099.94625: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883099.94630: Calling groups_plugins_play to load vars for managed_node2 28983 1726883099.97451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883100.00600: done with get_vars() 28983 1726883100.00637: done getting variables 28983 1726883100.00704: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:45:00 -0400 (0:00:00.088) 0:02:10.008 ****** 28983 1726883100.01030: entering _queue_task() for managed_node2/shell 28983 1726883100.01554: worker is 1 (out of 1 available) 28983 1726883100.01568: exiting _queue_task() for managed_node2/shell 28983 1726883100.01581: done queuing things up, now waiting for results queue to drain 28983 1726883100.01583: waiting for pending results... 
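The task skipped just above was guarded by a `when: profile_stat.stat.exists` conditional that evaluated to False. As a rough illustration (not Ansible's actual implementation), the same guard expressed directly in shell looks like this; the profile path below is hypothetical, standing in for whatever file the earlier `stat` task checked:

```shell
#!/bin/sh
# Hypothetical stand-in for the profile file checked by the earlier
# stat task; the real path comes from the role's profile variables.
profile_file="/nonexistent/statebr.nmconnection"

# Mirror of Ansible's `when: profile_stat.stat.exists` evaluation:
# run the task body only if the file exists, otherwise skip.
if [ -e "$profile_file" ]; then
    result="running task"
else
    result="skipping: conditional result was False"
fi
echo "$result"
```

With the file absent, this prints the skip branch, matching the `skip_reason` in the JSON result above.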
28983 1726883100.02125: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28983 1726883100.02342: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f1a 28983 1726883100.02347: variable 'ansible_search_path' from source: unknown 28983 1726883100.02350: variable 'ansible_search_path' from source: unknown 28983 1726883100.02354: calling self._execute() 28983 1726883100.02426: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.02431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.02448: variable 'omit' from source: magic vars 28983 1726883100.03406: variable 'ansible_distribution_major_version' from source: facts 28983 1726883100.03419: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883100.03423: variable 'omit' from source: magic vars 28983 1726883100.03601: variable 'omit' from source: magic vars 28983 1726883100.03838: variable 'profile' from source: play vars 28983 1726883100.03914: variable 'interface' from source: play vars 28983 1726883100.03991: variable 'interface' from source: play vars 28983 1726883100.04127: variable 'omit' from source: magic vars 28983 1726883100.04178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883100.04219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883100.04359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883100.04380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883100.04393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883100.04430: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 
1726883100.04436: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.04440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.04779: Set connection var ansible_connection to ssh 28983 1726883100.04792: Set connection var ansible_shell_executable to /bin/sh 28983 1726883100.04803: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883100.04813: Set connection var ansible_timeout to 10 28983 1726883100.04819: Set connection var ansible_pipelining to False 28983 1726883100.04822: Set connection var ansible_shell_type to sh 28983 1726883100.04850: variable 'ansible_shell_executable' from source: unknown 28983 1726883100.04853: variable 'ansible_connection' from source: unknown 28983 1726883100.04856: variable 'ansible_module_compression' from source: unknown 28983 1726883100.04859: variable 'ansible_shell_type' from source: unknown 28983 1726883100.05015: variable 'ansible_shell_executable' from source: unknown 28983 1726883100.05018: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.05021: variable 'ansible_pipelining' from source: unknown 28983 1726883100.05024: variable 'ansible_timeout' from source: unknown 28983 1726883100.05026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.05323: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883100.05328: variable 'omit' from source: magic vars 28983 1726883100.05331: starting attempt loop 28983 1726883100.05336: running the handler 28983 1726883100.05340: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883100.05364: _low_level_execute_command(): starting 28983 1726883100.05367: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883100.06383: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883100.06388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883100.06458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883100.06667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883100.08540: stdout chunk (state=3): >>>/root <<< 28983 1726883100.08651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883100.08655: stdout chunk (state=3): >>><<< 28983 1726883100.08665: stderr chunk (state=3): >>><<< 28983 1726883100.08839: _low_level_execute_command() done: rc=0, stdout=/root 
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883100.08862: _low_level_execute_command(): starting 28983 1726883100.08874: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197 `" && echo ansible-tmp-1726883100.0884664-33757-220398984076197="` echo /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197 `" ) && sleep 0' 28983 1726883100.09452: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883100.09467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883100.09482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883100.09510: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883100.09528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883100.09543: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883100.09557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.09576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883100.09590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883100.09603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883100.09652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.09723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883100.09743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883100.09779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883100.09924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883100.11966: stdout chunk (state=3): >>>ansible-tmp-1726883100.0884664-33757-220398984076197=/root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197 <<< 28983 1726883100.12439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883100.12443: stdout chunk (state=3): >>><<< 28983 1726883100.12446: stderr chunk (state=3): >>><<< 28983 1726883100.12449: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726883100.0884664-33757-220398984076197=/root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883100.12451: variable 'ansible_module_compression' from source: unknown 28983 1726883100.12453: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883100.12455: variable 'ansible_facts' from source: unknown 28983 1726883100.12762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py 28983 1726883100.13462: Sending initial data 28983 1726883100.13466: Sent initial data (156 bytes) 28983 1726883100.14316: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.14444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883100.14448: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883100.14451: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 28983 1726883100.14453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883100.14455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883100.14457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883100.14460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.14768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883100.14952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883100.15051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883100.16752: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883100.16814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883100.16886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxjwsk9h2 /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py <<< 28983 1726883100.16898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py" <<< 28983 1726883100.16953: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxjwsk9h2" to remote "/root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py" <<< 28983 1726883100.20020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883100.20023: stdout chunk (state=3): >>><<< 28983 1726883100.20037: stderr chunk (state=3): >>><<< 28983 1726883100.20060: done transferring module to remote 28983 1726883100.20076: _low_level_execute_command(): starting 28983 1726883100.20080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/ /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py && sleep 0' 28983 1726883100.21556: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883100.21564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.21588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883100.21699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.21819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883100.21856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883100.21927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883100.23968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883100.23974: stdout chunk (state=3): >>><<< 28983 1726883100.24040: stderr chunk (state=3): >>><<< 28983 1726883100.24044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883100.24047: _low_level_execute_command(): starting 28983 1726883100.24051: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/AnsiballZ_command.py && sleep 0' 28983 1726883100.25075: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883100.25086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883100.25113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883100.25131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883100.25309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883100.25447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883100.25559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883100.44477: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:45:00.425397", "end": "2024-09-20 21:45:00.443538", "delta": "0:00:00.018141", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883100.46073: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 28983 1726883100.46089: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883100.46190: stderr chunk (state=3): >>><<< 28983 1726883100.46200: stdout chunk (state=3): >>><<< 28983 1726883100.46224: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:45:00.425397", "end": "2024-09-20 21:45:00.443538", "delta": "0:00:00.018141", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
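The `rc=1` above does not indicate that `nmcli` itself failed: in a POSIX shell pipeline the exit status is that of the last command, and `grep` exits 1 when it selects no lines. Since no connection named `statebr` had a profile file under `/etc`, the final `grep /etc` matched nothing and the whole pipeline returned 1. A minimal reproduction with fabricated sample output (the connection listing below is illustrative, not taken from the managed node):

```shell
#!/bin/sh
# Sample nmcli-style output with no "statebr" entry (illustrative data).
# The pipeline's exit status is that of the final grep, which exits 1
# on no match -- producing rc=1 even though nothing actually broke.
printf 'NAME  FILENAME\neth0  /run/NetworkManager/system-connections/eth0.nmconnection\n' \
    | grep statebr | grep /etc
rc=$?
echo "rc=$rc"
```

This is why the play treats the non-zero return code as a signal ("profile not found") rather than a hard failure, as the `...ignoring` on the task result below shows.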
28983 1726883100.46292: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883100.46315: _low_level_execute_command(): starting 28983 1726883100.46326: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883100.0884664-33757-220398984076197/ > /dev/null 2>&1 && sleep 0' 28983 1726883100.46947: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883100.46963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883100.46987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883100.47006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883100.47106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883100.47131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883100.47156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883100.47181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883100.47346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883100.49353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883100.49424: stderr chunk (state=3): >>><<< 28983 1726883100.49440: stdout chunk (state=3): >>><<< 28983 1726883100.49520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883100.49538: handler run complete 28983 1726883100.49579: Evaluated conditional (False): False 28983 1726883100.49632: attempt loop complete, returning result 28983 1726883100.49842: _execute() done 28983 1726883100.49845: dumping result to json 28983 1726883100.49847: done dumping result, returning 28983 1726883100.49849: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affe814-3a2d-b16d-c0a7-000000001f1a] 28983 1726883100.49852: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1a 28983 1726883100.49929: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1a 28983 1726883100.49932: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018141", "end": "2024-09-20 21:45:00.443538", "rc": 1, "start": "2024-09-20 21:45:00.425397" } MSG: non-zero return code ...ignoring 28983 1726883100.50043: no more pending results, returning what we have 28983 1726883100.50048: results queue empty 28983 1726883100.50049: checking for any_errors_fatal 28983 1726883100.50058: done checking for any_errors_fatal 28983 1726883100.50059: checking for max_fail_percentage 28983 1726883100.50061: done checking for max_fail_percentage 28983 1726883100.50062: checking to see if all hosts have failed and the running result is not ok 28983 1726883100.50063: done checking to see if all hosts have failed 28983 1726883100.50064: getting the remaining hosts for this loop 28983 1726883100.50067: done getting the remaining hosts for this loop 28983 1726883100.50074: getting the next task for host managed_node2 28983 1726883100.50084: done getting next task for host managed_node2 28983 1726883100.50088: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based 
on the nmcli output 28983 1726883100.50096: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883100.50101: getting variables 28983 1726883100.50102: in VariableManager get_vars() 28983 1726883100.50458: Calling all_inventory to load vars for managed_node2 28983 1726883100.50462: Calling groups_inventory to load vars for managed_node2 28983 1726883100.50467: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883100.50481: Calling all_plugins_play to load vars for managed_node2 28983 1726883100.50485: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883100.50489: Calling groups_plugins_play to load vars for managed_node2 28983 1726883100.55284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883100.58951: done with get_vars() 28983 1726883100.58996: done getting variables 28983 1726883100.59185: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:45:00 -0400 (0:00:00.581) 0:02:10.590 ****** 28983 1726883100.59227: entering _queue_task() for managed_node2/set_fact 28983 1726883100.59881: worker is 1 (out of 1 available) 28983 1726883100.59898: exiting _queue_task() for managed_node2/set_fact 28983 1726883100.59912: done queuing things up, now waiting for results queue to drain 28983 1726883100.59914: waiting for pending results... 
28983 1726883100.60124: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883100.60229: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f1b 28983 1726883100.60246: variable 'ansible_search_path' from source: unknown 28983 1726883100.60250: variable 'ansible_search_path' from source: unknown 28983 1726883100.60284: calling self._execute() 28983 1726883100.60379: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.60387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.60398: variable 'omit' from source: magic vars 28983 1726883100.60734: variable 'ansible_distribution_major_version' from source: facts 28983 1726883100.60749: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883100.60868: variable 'nm_profile_exists' from source: set_fact 28983 1726883100.60882: Evaluated conditional (nm_profile_exists.rc == 0): False 28983 1726883100.60886: when evaluation is False, skipping this task 28983 1726883100.60889: _execute() done 28983 1726883100.60893: dumping result to json 28983 1726883100.60898: done dumping result, returning 28983 1726883100.60909: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-b16d-c0a7-000000001f1b] 28983 1726883100.60912: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1b 28983 1726883100.61009: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1b 28983 1726883100.61013: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28983 1726883100.61075: no more pending results, returning what we have 28983 1726883100.61080: results queue empty 28983 1726883100.61081: checking for any_errors_fatal 28983 
1726883100.61093: done checking for any_errors_fatal 28983 1726883100.61094: checking for max_fail_percentage 28983 1726883100.61096: done checking for max_fail_percentage 28983 1726883100.61097: checking to see if all hosts have failed and the running result is not ok 28983 1726883100.61098: done checking to see if all hosts have failed 28983 1726883100.61099: getting the remaining hosts for this loop 28983 1726883100.61101: done getting the remaining hosts for this loop 28983 1726883100.61105: getting the next task for host managed_node2 28983 1726883100.61117: done getting next task for host managed_node2 28983 1726883100.61120: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883100.61137: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883100.61143: getting variables 28983 1726883100.61144: in VariableManager get_vars() 28983 1726883100.61185: Calling all_inventory to load vars for managed_node2 28983 1726883100.61189: Calling groups_inventory to load vars for managed_node2 28983 1726883100.61193: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883100.61203: Calling all_plugins_play to load vars for managed_node2 28983 1726883100.61206: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883100.61209: Calling groups_plugins_play to load vars for managed_node2 28983 1726883100.62865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883100.65823: done with get_vars() 28983 1726883100.65860: done getting variables 28983 1726883100.65925: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883100.66129: variable 'profile' from source: play vars 28983 1726883100.66136: variable 'interface' from source: play vars 28983 1726883100.66206: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:45:00 -0400 (0:00:00.070) 0:02:10.660 ****** 28983 1726883100.66245: entering _queue_task() for managed_node2/command 28983 1726883100.66970: worker is 1 (out of 1 available) 28983 1726883100.66983: exiting _queue_task() for managed_node2/command 28983 1726883100.66994: done queuing things up, now waiting for results queue to drain 28983 1726883100.66996: waiting for pending results... 
28983 1726883100.67557: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 28983 1726883100.67600: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f1d 28983 1726883100.67621: variable 'ansible_search_path' from source: unknown 28983 1726883100.67625: variable 'ansible_search_path' from source: unknown 28983 1726883100.67784: calling self._execute() 28983 1726883100.67798: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.67806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.67821: variable 'omit' from source: magic vars 28983 1726883100.68299: variable 'ansible_distribution_major_version' from source: facts 28983 1726883100.68312: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883100.68483: variable 'profile_stat' from source: set_fact 28983 1726883100.68503: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883100.68506: when evaluation is False, skipping this task 28983 1726883100.68509: _execute() done 28983 1726883100.68511: dumping result to json 28983 1726883100.68518: done dumping result, returning 28983 1726883100.68525: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000001f1d] 28983 1726883100.68531: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1d 28983 1726883100.68638: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1d 28983 1726883100.68641: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883100.68704: no more pending results, returning what we have 28983 1726883100.68709: results queue empty 28983 1726883100.68710: checking for any_errors_fatal 28983 1726883100.68721: done checking for any_errors_fatal 28983 1726883100.68722: 
checking for max_fail_percentage 28983 1726883100.68724: done checking for max_fail_percentage 28983 1726883100.68725: checking to see if all hosts have failed and the running result is not ok 28983 1726883100.68726: done checking to see if all hosts have failed 28983 1726883100.68727: getting the remaining hosts for this loop 28983 1726883100.68730: done getting the remaining hosts for this loop 28983 1726883100.68736: getting the next task for host managed_node2 28983 1726883100.68745: done getting next task for host managed_node2 28983 1726883100.68749: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883100.68755: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883100.68759: getting variables 28983 1726883100.68761: in VariableManager get_vars() 28983 1726883100.68803: Calling all_inventory to load vars for managed_node2 28983 1726883100.68806: Calling groups_inventory to load vars for managed_node2 28983 1726883100.68810: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883100.68821: Calling all_plugins_play to load vars for managed_node2 28983 1726883100.68824: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883100.68828: Calling groups_plugins_play to load vars for managed_node2 28983 1726883100.82545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883100.85623: done with get_vars() 28983 1726883100.85664: done getting variables 28983 1726883100.85724: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883100.85839: variable 'profile' from source: play vars 28983 1726883100.85843: variable 'interface' from source: play vars 28983 1726883100.85915: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:45:00 -0400 (0:00:00.197) 0:02:10.857 ****** 28983 1726883100.85954: entering _queue_task() for managed_node2/set_fact 28983 1726883100.86340: worker is 1 (out of 1 available) 28983 1726883100.86354: exiting _queue_task() for managed_node2/set_fact 28983 1726883100.86370: done queuing things up, now waiting for results queue to drain 28983 1726883100.86373: waiting for pending results... 
28983 1726883100.86759: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 28983 1726883100.86862: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f1e 28983 1726883100.86890: variable 'ansible_search_path' from source: unknown 28983 1726883100.86900: variable 'ansible_search_path' from source: unknown 28983 1726883100.86953: calling self._execute() 28983 1726883100.87079: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.87095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.87115: variable 'omit' from source: magic vars 28983 1726883100.87563: variable 'ansible_distribution_major_version' from source: facts 28983 1726883100.87607: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883100.87771: variable 'profile_stat' from source: set_fact 28983 1726883100.87792: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883100.87824: when evaluation is False, skipping this task 28983 1726883100.87827: _execute() done 28983 1726883100.87830: dumping result to json 28983 1726883100.87833: done dumping result, returning 28983 1726883100.87839: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000001f1e] 28983 1726883100.87934: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1e 28983 1726883100.88010: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1e 28983 1726883100.88013: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883100.88075: no more pending results, returning what we have 28983 1726883100.88081: results queue empty 28983 1726883100.88082: checking for any_errors_fatal 28983 1726883100.88092: done checking for any_errors_fatal 28983 1726883100.88094: 
checking for max_fail_percentage 28983 1726883100.88096: done checking for max_fail_percentage 28983 1726883100.88098: checking to see if all hosts have failed and the running result is not ok 28983 1726883100.88099: done checking to see if all hosts have failed 28983 1726883100.88100: getting the remaining hosts for this loop 28983 1726883100.88103: done getting the remaining hosts for this loop 28983 1726883100.88110: getting the next task for host managed_node2 28983 1726883100.88121: done getting next task for host managed_node2 28983 1726883100.88124: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28983 1726883100.88132: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883100.88140: getting variables 28983 1726883100.88142: in VariableManager get_vars() 28983 1726883100.88201: Calling all_inventory to load vars for managed_node2 28983 1726883100.88205: Calling groups_inventory to load vars for managed_node2 28983 1726883100.88209: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883100.88225: Calling all_plugins_play to load vars for managed_node2 28983 1726883100.88229: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883100.88436: Calling groups_plugins_play to load vars for managed_node2 28983 1726883100.91080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883100.93499: done with get_vars() 28983 1726883100.93524: done getting variables 28983 1726883100.93588: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883100.93757: variable 'profile' from source: play vars 28983 1726883100.93762: variable 'interface' from source: play vars 28983 1726883100.93868: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:45:00 -0400 (0:00:00.079) 0:02:10.936 ****** 28983 1726883100.93912: entering _queue_task() for managed_node2/command 28983 1726883100.94314: worker is 1 (out of 1 available) 28983 1726883100.94330: exiting _queue_task() for managed_node2/command 28983 1726883100.94348: done queuing things up, now waiting for results queue to drain 28983 1726883100.94357: waiting for pending results... 
28983 1726883100.94767: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 28983 1726883100.94894: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f1f 28983 1726883100.94908: variable 'ansible_search_path' from source: unknown 28983 1726883100.94912: variable 'ansible_search_path' from source: unknown 28983 1726883100.94955: calling self._execute() 28983 1726883100.95044: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883100.95053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883100.95064: variable 'omit' from source: magic vars 28983 1726883100.95404: variable 'ansible_distribution_major_version' from source: facts 28983 1726883100.95415: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883100.95527: variable 'profile_stat' from source: set_fact 28983 1726883100.95540: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883100.95544: when evaluation is False, skipping this task 28983 1726883100.95547: _execute() done 28983 1726883100.95550: dumping result to json 28983 1726883100.95555: done dumping result, returning 28983 1726883100.95562: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000001f1f] 28983 1726883100.95569: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1f 28983 1726883100.95666: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f1f 28983 1726883100.95669: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883100.95731: no more pending results, returning what we have 28983 1726883100.95737: results queue empty 28983 1726883100.95738: checking for any_errors_fatal 28983 1726883100.95746: done checking for any_errors_fatal 28983 1726883100.95747: checking for 
max_fail_percentage 28983 1726883100.95749: done checking for max_fail_percentage 28983 1726883100.95750: checking to see if all hosts have failed and the running result is not ok 28983 1726883100.95751: done checking to see if all hosts have failed 28983 1726883100.95752: getting the remaining hosts for this loop 28983 1726883100.95754: done getting the remaining hosts for this loop 28983 1726883100.95759: getting the next task for host managed_node2 28983 1726883100.95768: done getting next task for host managed_node2 28983 1726883100.95770: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28983 1726883100.95776: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883100.95781: getting variables 28983 1726883100.95783: in VariableManager get_vars() 28983 1726883100.95821: Calling all_inventory to load vars for managed_node2 28983 1726883100.95824: Calling groups_inventory to load vars for managed_node2 28983 1726883100.95827: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883100.95843: Calling all_plugins_play to load vars for managed_node2 28983 1726883100.95847: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883100.95851: Calling groups_plugins_play to load vars for managed_node2 28983 1726883100.97242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883100.99469: done with get_vars() 28983 1726883100.99492: done getting variables 28983 1726883100.99542: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883100.99630: variable 'profile' from source: play vars 28983 1726883100.99637: variable 'interface' from source: play vars 28983 1726883100.99695: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:45:00 -0400 (0:00:00.058) 0:02:10.995 ****** 28983 1726883100.99722: entering _queue_task() for managed_node2/set_fact 28983 1726883100.99958: worker is 1 (out of 1 available) 28983 1726883100.99973: exiting _queue_task() for managed_node2/set_fact 28983 1726883100.99986: done queuing things up, now waiting for results queue to drain 28983 1726883100.99988: waiting for pending results... 
28983 1726883101.00230: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 28983 1726883101.00329: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f20 28983 1726883101.00342: variable 'ansible_search_path' from source: unknown 28983 1726883101.00348: variable 'ansible_search_path' from source: unknown 28983 1726883101.00382: calling self._execute() 28983 1726883101.00489: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.00496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.00505: variable 'omit' from source: magic vars 28983 1726883101.00906: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.00910: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.01075: variable 'profile_stat' from source: set_fact 28983 1726883101.01088: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883101.01091: when evaluation is False, skipping this task 28983 1726883101.01096: _execute() done 28983 1726883101.01099: dumping result to json 28983 1726883101.01105: done dumping result, returning 28983 1726883101.01119: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000001f20] 28983 1726883101.01123: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f20 28983 1726883101.01338: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f20 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883101.01384: no more pending results, returning what we have 28983 1726883101.01387: results queue empty 28983 1726883101.01388: checking for any_errors_fatal 28983 1726883101.01394: done checking for any_errors_fatal 28983 1726883101.01395: checking for max_fail_percentage 28983 1726883101.01397: 
done checking for max_fail_percentage 28983 1726883101.01398: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.01399: done checking to see if all hosts have failed 28983 1726883101.01400: getting the remaining hosts for this loop 28983 1726883101.01402: done getting the remaining hosts for this loop 28983 1726883101.01406: getting the next task for host managed_node2 28983 1726883101.01414: done getting next task for host managed_node2 28983 1726883101.01417: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28983 1726883101.01420: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883101.01425: getting variables 28983 1726883101.01426: in VariableManager get_vars() 28983 1726883101.01513: Calling all_inventory to load vars for managed_node2 28983 1726883101.01517: Calling groups_inventory to load vars for managed_node2 28983 1726883101.01520: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.01527: WORKER PROCESS EXITING 28983 1726883101.01537: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.01549: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.01554: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.03550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.05131: done with get_vars() 28983 1726883101.05155: done getting variables 28983 1726883101.05203: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883101.05294: variable 'profile' from source: play vars 28983 1726883101.05298: variable 'interface' from source: play vars 28983 1726883101.05354: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:45:01 -0400 (0:00:00.056) 0:02:11.051 ****** 28983 1726883101.05398: entering _queue_task() for managed_node2/assert 28983 1726883101.05695: worker is 1 (out of 1 available) 28983 1726883101.05710: exiting _queue_task() for managed_node2/assert 28983 1726883101.05724: done queuing things up, now waiting for results queue to drain 28983 1726883101.05726: 
waiting for pending results... 28983 1726883101.06084: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 28983 1726883101.06184: in run() - task 0affe814-3a2d-b16d-c0a7-000000001e9a 28983 1726883101.06197: variable 'ansible_search_path' from source: unknown 28983 1726883101.06207: variable 'ansible_search_path' from source: unknown 28983 1726883101.06265: calling self._execute() 28983 1726883101.06375: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.06379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.06406: variable 'omit' from source: magic vars 28983 1726883101.06851: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.06859: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.06873: variable 'omit' from source: magic vars 28983 1726883101.06978: variable 'omit' from source: magic vars 28983 1726883101.07066: variable 'profile' from source: play vars 28983 1726883101.07074: variable 'interface' from source: play vars 28983 1726883101.07158: variable 'interface' from source: play vars 28983 1726883101.07181: variable 'omit' from source: magic vars 28983 1726883101.07246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883101.07286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883101.07312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883101.07337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.07373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.07407: variable 'inventory_hostname' 
from source: host vars for 'managed_node2' 28983 1726883101.07418: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.07423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.07542: Set connection var ansible_connection to ssh 28983 1726883101.07556: Set connection var ansible_shell_executable to /bin/sh 28983 1726883101.07611: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883101.07615: Set connection var ansible_timeout to 10 28983 1726883101.07617: Set connection var ansible_pipelining to False 28983 1726883101.07620: Set connection var ansible_shell_type to sh 28983 1726883101.07623: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.07625: variable 'ansible_connection' from source: unknown 28983 1726883101.07627: variable 'ansible_module_compression' from source: unknown 28983 1726883101.07630: variable 'ansible_shell_type' from source: unknown 28983 1726883101.07638: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.07644: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.07650: variable 'ansible_pipelining' from source: unknown 28983 1726883101.07653: variable 'ansible_timeout' from source: unknown 28983 1726883101.07722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.07833: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883101.07844: variable 'omit' from source: magic vars 28983 1726883101.07852: starting attempt loop 28983 1726883101.07855: running the handler 28983 1726883101.08006: variable 'lsr_net_profile_exists' from source: set_fact 28983 1726883101.08016: 
Evaluated conditional (not lsr_net_profile_exists): True 28983 1726883101.08026: handler run complete 28983 1726883101.08042: attempt loop complete, returning result 28983 1726883101.08045: _execute() done 28983 1726883101.08048: dumping result to json 28983 1726883101.08054: done dumping result, returning 28983 1726883101.08060: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-000000001e9a] 28983 1726883101.08066: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001e9a 28983 1726883101.08164: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001e9a 28983 1726883101.08167: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883101.08241: no more pending results, returning what we have 28983 1726883101.08244: results queue empty 28983 1726883101.08245: checking for any_errors_fatal 28983 1726883101.08252: done checking for any_errors_fatal 28983 1726883101.08253: checking for max_fail_percentage 28983 1726883101.08255: done checking for max_fail_percentage 28983 1726883101.08256: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.08257: done checking to see if all hosts have failed 28983 1726883101.08258: getting the remaining hosts for this loop 28983 1726883101.08260: done getting the remaining hosts for this loop 28983 1726883101.08264: getting the next task for host managed_node2 28983 1726883101.08307: done getting next task for host managed_node2 28983 1726883101.08317: ^ task is: TASK: Conditional asserts 28983 1726883101.08320: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883101.08325: getting variables 28983 1726883101.08327: in VariableManager get_vars() 28983 1726883101.08363: Calling all_inventory to load vars for managed_node2 28983 1726883101.08367: Calling groups_inventory to load vars for managed_node2 28983 1726883101.08373: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.08382: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.08385: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.08388: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.09793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.11821: done with get_vars() 28983 1726883101.11847: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:45:01 -0400 (0:00:00.065) 0:02:11.116 ****** 28983 1726883101.11920: entering _queue_task() for managed_node2/include_tasks 28983 1726883101.12145: worker is 1 (out of 1 available) 28983 1726883101.12159: exiting _queue_task() for managed_node2/include_tasks 28983 1726883101.12175: done queuing things up, now waiting for results queue to drain 28983 1726883101.12177: waiting for pending results... 
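
[Editor's note: the assert task that just completed above evaluated the conditional `(not lsr_net_profile_exists)` and returned "All assertions passed". Based only on what the log shows (the task name, the `profile`/`interface` play vars, and the evaluated conditional), the task in `assert_profile_absent.yml` plausibly has roughly this shape — a hedged reconstruction, not the actual file contents:]

```yaml
# Hypothetical sketch of the "Assert that the profile is absent" task,
# inferred from the log above; the real task file may differ.
- name: "Assert that the profile is absent - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      # lsr_net_profile_exists was set earlier via set_fact (per the log)
      - not lsr_net_profile_exists
    msg: "Profile {{ profile }} still exists"
```
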
28983 1726883101.12367: running TaskExecutor() for managed_node2/TASK: Conditional asserts 28983 1726883101.12455: in run() - task 0affe814-3a2d-b16d-c0a7-00000000174a 28983 1726883101.12467: variable 'ansible_search_path' from source: unknown 28983 1726883101.12474: variable 'ansible_search_path' from source: unknown 28983 1726883101.12708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883101.14485: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883101.14536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883101.14566: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883101.14602: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883101.14625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883101.14705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883101.14729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883101.14752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883101.14785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 28983 1726883101.14798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883101.14894: variable 'lsr_assert_when' from source: include params 28983 1726883101.14989: variable 'network_provider' from source: set_fact 28983 1726883101.15052: variable 'omit' from source: magic vars 28983 1726883101.15155: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.15170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.15181: variable 'omit' from source: magic vars 28983 1726883101.15351: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.15364: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.15466: variable 'item' from source: unknown 28983 1726883101.15470: Evaluated conditional (item['condition']): True 28983 1726883101.15539: variable 'item' from source: unknown 28983 1726883101.15566: variable 'item' from source: unknown 28983 1726883101.15624: variable 'item' from source: unknown 28983 1726883101.15788: dumping result to json 28983 1726883101.15792: done dumping result, returning 28983 1726883101.15795: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0affe814-3a2d-b16d-c0a7-00000000174a] 28983 1726883101.15797: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000174a 28983 1726883101.15878: no more pending results, returning what we have 28983 1726883101.15884: in VariableManager get_vars() 28983 1726883101.15928: Calling all_inventory to load vars for managed_node2 28983 1726883101.15931: Calling groups_inventory to load vars for managed_node2 28983 1726883101.15937: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.15951: Calling all_plugins_play to load vars for managed_node2 28983 
1726883101.15955: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.15959: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.17144: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000174a 28983 1726883101.17148: WORKER PROCESS EXITING 28983 1726883101.17451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.19781: done with get_vars() 28983 1726883101.19802: variable 'ansible_search_path' from source: unknown 28983 1726883101.19803: variable 'ansible_search_path' from source: unknown 28983 1726883101.19833: we have included files to process 28983 1726883101.19835: generating all_blocks data 28983 1726883101.19838: done generating all_blocks data 28983 1726883101.19844: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883101.19845: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883101.19847: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883101.19930: in VariableManager get_vars() 28983 1726883101.19952: done with get_vars() 28983 1726883101.20044: done processing included file 28983 1726883101.20046: iterating over new_blocks loaded from include file 28983 1726883101.20047: in VariableManager get_vars() 28983 1726883101.20063: done with get_vars() 28983 1726883101.20064: filtering new block on tags 28983 1726883101.20094: done filtering new block on tags 28983 1726883101.20096: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 28983 1726883101.20100: extending task lists for all hosts with included blocks 28983 1726883101.21064: done extending task lists 28983 1726883101.21065: done processing included files 28983 1726883101.21066: results queue empty 28983 1726883101.21067: checking for any_errors_fatal 28983 1726883101.21070: done checking for any_errors_fatal 28983 1726883101.21072: checking for max_fail_percentage 28983 1726883101.21073: done checking for max_fail_percentage 28983 1726883101.21074: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.21075: done checking to see if all hosts have failed 28983 1726883101.21075: getting the remaining hosts for this loop 28983 1726883101.21076: done getting the remaining hosts for this loop 28983 1726883101.21078: getting the next task for host managed_node2 28983 1726883101.21082: done getting next task for host managed_node2 28983 1726883101.21084: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726883101.21086: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883101.21092: getting variables 28983 1726883101.21092: in VariableManager get_vars() 28983 1726883101.21101: Calling all_inventory to load vars for managed_node2 28983 1726883101.21103: Calling groups_inventory to load vars for managed_node2 28983 1726883101.21105: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.21109: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.21111: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.21113: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.22289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.23866: done with get_vars() 28983 1726883101.23891: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:45:01 -0400 (0:00:00.120) 0:02:11.237 ****** 28983 1726883101.23951: entering _queue_task() for managed_node2/include_tasks 28983 1726883101.24204: worker is 1 (out of 1 available) 28983 1726883101.24220: exiting _queue_task() for managed_node2/include_tasks 28983 1726883101.24235: done queuing things up, now waiting for results queue to drain 28983 1726883101.24237: waiting for pending results... 
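
[Editor's note: the "Conditional asserts" task above included `tasks/assert_device_absent.yml` as a loop item of the form `{'what': 'tasks/assert_device_absent.yml', 'condition': True}`, which in turn includes `get_interface_stat.yml` here. A minimal sketch of that conditional-include pattern, assuming an `lsr_assert_when` list variable as named in the log:]

```yaml
# Hedged sketch of the conditional-include pattern visible in the log:
# each item carries a 'what' (file to include) and a 'condition'.
- name: Conditional asserts
  ansible.builtin.include_tasks: "{{ item.what }}"
  when: item.condition
  loop: "{{ lsr_assert_when }}"
```
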
28983 1726883101.24429: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726883101.24528: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f59 28983 1726883101.24541: variable 'ansible_search_path' from source: unknown 28983 1726883101.24545: variable 'ansible_search_path' from source: unknown 28983 1726883101.24582: calling self._execute() 28983 1726883101.24669: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.24677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.24687: variable 'omit' from source: magic vars 28983 1726883101.25009: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.25024: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.25028: _execute() done 28983 1726883101.25031: dumping result to json 28983 1726883101.25037: done dumping result, returning 28983 1726883101.25044: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-000000001f59] 28983 1726883101.25049: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f59 28983 1726883101.25228: no more pending results, returning what we have 28983 1726883101.25233: in VariableManager get_vars() 28983 1726883101.25283: Calling all_inventory to load vars for managed_node2 28983 1726883101.25286: Calling groups_inventory to load vars for managed_node2 28983 1726883101.25289: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.25299: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.25302: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.25306: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.25857: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f59 28983 1726883101.25860: WORKER PROCESS EXITING 28983 
1726883101.26677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.28896: done with get_vars() 28983 1726883101.28916: variable 'ansible_search_path' from source: unknown 28983 1726883101.28917: variable 'ansible_search_path' from source: unknown 28983 1726883101.29028: variable 'item' from source: include params 28983 1726883101.29058: we have included files to process 28983 1726883101.29059: generating all_blocks data 28983 1726883101.29060: done generating all_blocks data 28983 1726883101.29062: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883101.29063: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883101.29064: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883101.29203: done processing included file 28983 1726883101.29204: iterating over new_blocks loaded from include file 28983 1726883101.29205: in VariableManager get_vars() 28983 1726883101.29220: done with get_vars() 28983 1726883101.29222: filtering new block on tags 28983 1726883101.29245: done filtering new block on tags 28983 1726883101.29246: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726883101.29250: extending task lists for all hosts with included blocks 28983 1726883101.29379: done extending task lists 28983 1726883101.29380: done processing included files 28983 1726883101.29381: results queue empty 28983 1726883101.29382: checking for any_errors_fatal 28983 1726883101.29384: done checking for any_errors_fatal 28983 1726883101.29384: checking for 
max_fail_percentage 28983 1726883101.29385: done checking for max_fail_percentage 28983 1726883101.29386: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.29386: done checking to see if all hosts have failed 28983 1726883101.29387: getting the remaining hosts for this loop 28983 1726883101.29388: done getting the remaining hosts for this loop 28983 1726883101.29390: getting the next task for host managed_node2 28983 1726883101.29393: done getting next task for host managed_node2 28983 1726883101.29395: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726883101.29398: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883101.29399: getting variables 28983 1726883101.29400: in VariableManager get_vars() 28983 1726883101.29409: Calling all_inventory to load vars for managed_node2 28983 1726883101.29410: Calling groups_inventory to load vars for managed_node2 28983 1726883101.29412: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.29416: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.29418: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.29420: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.30816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.33723: done with get_vars() 28983 1726883101.33758: done getting variables 28983 1726883101.33897: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:45:01 -0400 (0:00:00.099) 0:02:11.337 ****** 28983 1726883101.33930: entering _queue_task() for managed_node2/stat 28983 1726883101.34247: worker is 1 (out of 1 available) 28983 1726883101.34265: exiting _queue_task() for managed_node2/stat 28983 1726883101.34277: done queuing things up, now waiting for results queue to drain 28983 1726883101.34279: waiting for pending results... 
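
[Editor's note: the queued stat task below checks whether the `statebr` interface device is present. The log shows only that the `stat` module runs with the templated interface name; the checked path is an assumption (network system-role tests conventionally stat the sysfs entry for the device):]

```yaml
# Hypothetical sketch of get_interface_stat.yml; the /sys/class/net
# path is an assumption and is not shown anywhere in this log.
- name: "Get stat for interface {{ interface }}"
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat
```
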
28983 1726883101.34724: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726883101.35088: in run() - task 0affe814-3a2d-b16d-c0a7-000000001fe8 28983 1726883101.35092: variable 'ansible_search_path' from source: unknown 28983 1726883101.35095: variable 'ansible_search_path' from source: unknown 28983 1726883101.35252: calling self._execute() 28983 1726883101.35360: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.35366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.35382: variable 'omit' from source: magic vars 28983 1726883101.36257: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.36261: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.36264: variable 'omit' from source: magic vars 28983 1726883101.36542: variable 'omit' from source: magic vars 28983 1726883101.36663: variable 'interface' from source: play vars 28983 1726883101.36687: variable 'omit' from source: magic vars 28983 1726883101.36737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883101.36784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883101.36813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883101.36836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.36851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.36889: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883101.36893: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.36898: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.37129: Set connection var ansible_connection to ssh 28983 1726883101.37133: Set connection var ansible_shell_executable to /bin/sh 28983 1726883101.37138: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883101.37140: Set connection var ansible_timeout to 10 28983 1726883101.37143: Set connection var ansible_pipelining to False 28983 1726883101.37146: Set connection var ansible_shell_type to sh 28983 1726883101.37148: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.37150: variable 'ansible_connection' from source: unknown 28983 1726883101.37152: variable 'ansible_module_compression' from source: unknown 28983 1726883101.37154: variable 'ansible_shell_type' from source: unknown 28983 1726883101.37157: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.37159: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.37162: variable 'ansible_pipelining' from source: unknown 28983 1726883101.37165: variable 'ansible_timeout' from source: unknown 28983 1726883101.37167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.37377: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883101.37390: variable 'omit' from source: magic vars 28983 1726883101.37398: starting attempt loop 28983 1726883101.37402: running the handler 28983 1726883101.37418: _low_level_execute_command(): starting 28983 1726883101.37427: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883101.38224: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883101.38237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883101.38256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883101.38264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883101.38326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883101.38385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883101.38473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883101.40270: stdout chunk (state=3): >>>/root <<< 28983 1726883101.40438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883101.40457: stdout chunk (state=3): >>><<< 28983 1726883101.40474: stderr chunk (state=3): >>><<< 28983 1726883101.40581: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883101.40585: _low_level_execute_command(): starting 28983 1726883101.40588: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689 `" && echo ansible-tmp-1726883101.4050531-33822-217193276612689="` echo /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689 `" ) && sleep 0' 28983 1726883101.41167: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883101.41182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883101.41198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883101.41218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883101.41349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883101.41361: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883101.41382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883101.41482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883101.43544: stdout chunk (state=3): >>>ansible-tmp-1726883101.4050531-33822-217193276612689=/root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689 <<< 28983 1726883101.43740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883101.43743: stdout chunk (state=3): >>><<< 28983 1726883101.43746: stderr chunk (state=3): >>><<< 28983 1726883101.43767: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883101.4050531-33822-217193276612689=/root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883101.43939: variable 'ansible_module_compression' from source: unknown 28983 1726883101.43942: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883101.43945: variable 'ansible_facts' from source: unknown 28983 1726883101.44019: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py 28983 1726883101.44189: Sending initial data 28983 1726883101.44199: Sent initial data (153 bytes) 28983 1726883101.44841: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883101.44854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883101.44957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883101.44987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883101.45013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883101.45028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883101.45130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883101.46830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883101.46927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883101.46996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8jtmvpkp /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py <<< 28983 1726883101.47008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py" <<< 28983 1726883101.47072: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp8jtmvpkp" to remote "/root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py" <<< 28983 1726883101.48367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883101.48379: stderr chunk (state=3): >>><<< 28983 1726883101.48388: stdout chunk (state=3): >>><<< 28983 1726883101.48418: done transferring module to remote 28983 1726883101.48522: _low_level_execute_command(): starting 28983 1726883101.48526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/ /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py && sleep 0' 28983 1726883101.49104: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883101.49119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883101.49135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883101.49154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883101.49189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883101.49296: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883101.49324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883101.49424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883101.51466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883101.51470: stdout chunk (state=3): >>><<< 28983 1726883101.51472: stderr chunk (state=3): >>><<< 28983 1726883101.51491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883101.51500: _low_level_execute_command(): starting 28983 1726883101.51586: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/AnsiballZ_stat.py && sleep 0' 28983 1726883101.52163: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883101.52255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883101.52289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883101.52305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883101.52324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883101.52431: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883101.69729: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883101.71117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883101.71170: stderr chunk (state=3): >>><<< 28983 1726883101.71174: stdout chunk (state=3): >>><<< 28983 1726883101.71192: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883101.71219: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883101.71232: _low_level_execute_command(): starting 28983 1726883101.71240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883101.4050531-33822-217193276612689/ > /dev/null 2>&1 && sleep 0' 28983 1726883101.71694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883101.71698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883101.71701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 
1726883101.71703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883101.71752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883101.71759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883101.71828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883101.73773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883101.73814: stderr chunk (state=3): >>><<< 28983 1726883101.73818: stdout chunk (state=3): >>><<< 28983 1726883101.73833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883101.73842: handler run complete 28983 1726883101.73863: attempt loop complete, returning result 28983 1726883101.73866: _execute() done 28983 1726883101.73869: dumping result to json 28983 1726883101.73875: done dumping result, returning 28983 1726883101.73883: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-000000001fe8] 28983 1726883101.73888: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001fe8 28983 1726883101.73994: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001fe8 28983 1726883101.73996: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883101.74067: no more pending results, returning what we have 28983 1726883101.74073: results queue empty 28983 1726883101.74075: checking for any_errors_fatal 28983 1726883101.74077: done checking for any_errors_fatal 28983 1726883101.74078: checking for max_fail_percentage 28983 1726883101.74081: done checking for max_fail_percentage 28983 1726883101.74084: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.74085: done checking to see if all hosts have failed 28983 1726883101.74086: getting the remaining hosts for this loop 28983 1726883101.74088: done getting the remaining hosts for this loop 28983 1726883101.74094: getting the next task for host managed_node2 28983 1726883101.74104: done getting next task for host managed_node2 28983 1726883101.74112: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28983 1726883101.74117: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883101.74122: getting variables 28983 1726883101.74123: in VariableManager get_vars() 28983 1726883101.74176: Calling all_inventory to load vars for managed_node2 28983 1726883101.74180: Calling groups_inventory to load vars for managed_node2 28983 1726883101.74184: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.74195: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.74198: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.74202: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.75612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.78144: done with get_vars() 28983 1726883101.78167: done getting variables 28983 1726883101.78219: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883101.78318: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:45:01 -0400 (0:00:00.444) 0:02:11.781 ****** 28983 1726883101.78347: entering _queue_task() for managed_node2/assert 28983 1726883101.78585: worker is 1 (out of 1 available) 28983 1726883101.78599: exiting _queue_task() for managed_node2/assert 28983 1726883101.78613: done queuing things up, now waiting for results queue to drain 28983 1726883101.78615: waiting for pending results... 28983 1726883101.78811: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 28983 1726883101.78910: in run() - task 0affe814-3a2d-b16d-c0a7-000000001f5a 28983 1726883101.78923: variable 'ansible_search_path' from source: unknown 28983 1726883101.78928: variable 'ansible_search_path' from source: unknown 28983 1726883101.78963: calling self._execute() 28983 1726883101.79052: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.79057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.79074: variable 'omit' from source: magic vars 28983 1726883101.79385: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.79396: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.79404: variable 'omit' from source: magic vars 28983 1726883101.79443: variable 'omit' from source: magic vars 28983 1726883101.79525: variable 'interface' from source: play vars 28983 1726883101.79542: variable 'omit' from source: magic vars 28983 1726883101.79580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883101.79614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883101.79631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
28983 1726883101.79649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.79660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.79690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883101.79694: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.79699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.79782: Set connection var ansible_connection to ssh 28983 1726883101.79792: Set connection var ansible_shell_executable to /bin/sh 28983 1726883101.79800: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883101.79809: Set connection var ansible_timeout to 10 28983 1726883101.79815: Set connection var ansible_pipelining to False 28983 1726883101.79819: Set connection var ansible_shell_type to sh 28983 1726883101.79843: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.79847: variable 'ansible_connection' from source: unknown 28983 1726883101.79849: variable 'ansible_module_compression' from source: unknown 28983 1726883101.79854: variable 'ansible_shell_type' from source: unknown 28983 1726883101.79856: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.79861: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.79866: variable 'ansible_pipelining' from source: unknown 28983 1726883101.79869: variable 'ansible_timeout' from source: unknown 28983 1726883101.79874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.79990: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883101.80002: variable 'omit' from source: magic vars 28983 1726883101.80008: starting attempt loop 28983 1726883101.80011: running the handler 28983 1726883101.80137: variable 'interface_stat' from source: set_fact 28983 1726883101.80146: Evaluated conditional (not interface_stat.stat.exists): True 28983 1726883101.80155: handler run complete 28983 1726883101.80169: attempt loop complete, returning result 28983 1726883101.80175: _execute() done 28983 1726883101.80178: dumping result to json 28983 1726883101.80181: done dumping result, returning 28983 1726883101.80188: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-000000001f5a] 28983 1726883101.80194: sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f5a 28983 1726883101.80291: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000001f5a 28983 1726883101.80294: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883101.80354: no more pending results, returning what we have 28983 1726883101.80357: results queue empty 28983 1726883101.80358: checking for any_errors_fatal 28983 1726883101.80365: done checking for any_errors_fatal 28983 1726883101.80366: checking for max_fail_percentage 28983 1726883101.80368: done checking for max_fail_percentage 28983 1726883101.80369: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.80372: done checking to see if all hosts have failed 28983 1726883101.80374: getting the remaining hosts for this loop 28983 1726883101.80375: done getting the remaining hosts for this loop 28983 1726883101.80381: getting the next task for host managed_node2 28983 1726883101.80389: done getting next task for host managed_node2 
28983 1726883101.80392: ^ task is: TASK: Success in test '{{ lsr_description }}' 28983 1726883101.80396: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883101.80400: getting variables 28983 1726883101.80401: in VariableManager get_vars() 28983 1726883101.80445: Calling all_inventory to load vars for managed_node2 28983 1726883101.80448: Calling groups_inventory to load vars for managed_node2 28983 1726883101.80451: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.80460: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.80464: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.80467: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.81791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.83376: done with get_vars() 28983 1726883101.83398: done getting variables 28983 1726883101.83449: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883101.83539: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:45:01 -0400 (0:00:00.052) 0:02:11.833 ****** 28983 1726883101.83574: entering _queue_task() for managed_node2/debug 28983 1726883101.83888: worker is 1 (out of 1 available) 28983 1726883101.83903: exiting _queue_task() for managed_node2/debug 28983 1726883101.83918: done queuing things up, now waiting for results queue to drain 28983 1726883101.83920: waiting for pending results... 28983 1726883101.84352: running TaskExecutor() for managed_node2/TASK: Success in test 'I can take a profile down that is absent' 28983 1726883101.84420: in run() - task 0affe814-3a2d-b16d-c0a7-00000000174b 28983 1726883101.84448: variable 'ansible_search_path' from source: unknown 28983 1726883101.84458: variable 'ansible_search_path' from source: unknown 28983 1726883101.84514: calling self._execute() 28983 1726883101.84641: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.84840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.84844: variable 'omit' from source: magic vars 28983 1726883101.85146: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.85166: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.85187: variable 'omit' from source: magic vars 28983 1726883101.85238: variable 'omit' from source: magic vars 28983 1726883101.85377: variable 'lsr_description' from source: include params 28983 1726883101.85411: variable 'omit' from source: magic vars 28983 1726883101.85467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883101.85523: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883101.85557: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883101.85588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.85607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883101.85659: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883101.85670: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.85686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.85824: Set connection var ansible_connection to ssh 28983 1726883101.85852: Set connection var ansible_shell_executable to /bin/sh 28983 1726883101.85874: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883101.85895: Set connection var ansible_timeout to 10 28983 1726883101.85942: Set connection var ansible_pipelining to False 28983 1726883101.85946: Set connection var ansible_shell_type to sh 28983 1726883101.85955: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.85966: variable 'ansible_connection' from source: unknown 28983 1726883101.85980: variable 'ansible_module_compression' from source: unknown 28983 1726883101.85990: variable 'ansible_shell_type' from source: unknown 28983 1726883101.86040: variable 'ansible_shell_executable' from source: unknown 28983 1726883101.86043: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.86052: variable 'ansible_pipelining' from source: unknown 28983 1726883101.86055: variable 'ansible_timeout' from source: unknown 28983 1726883101.86057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.86229: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883101.86254: variable 'omit' from source: magic vars 28983 1726883101.86276: starting attempt loop 28983 1726883101.86381: running the handler 28983 1726883101.86385: handler run complete 28983 1726883101.86388: attempt loop complete, returning result 28983 1726883101.86391: _execute() done 28983 1726883101.86393: dumping result to json 28983 1726883101.86399: done dumping result, returning 28983 1726883101.86414: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can take a profile down that is absent' [0affe814-3a2d-b16d-c0a7-00000000174b] 28983 1726883101.86426: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000174b 28983 1726883101.86741: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000174b 28983 1726883101.86745: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 28983 1726883101.86801: no more pending results, returning what we have 28983 1726883101.86804: results queue empty 28983 1726883101.86805: checking for any_errors_fatal 28983 1726883101.86812: done checking for any_errors_fatal 28983 1726883101.86814: checking for max_fail_percentage 28983 1726883101.86816: done checking for max_fail_percentage 28983 1726883101.86817: checking to see if all hosts have failed and the running result is not ok 28983 1726883101.86818: done checking to see if all hosts have failed 28983 1726883101.86819: getting the remaining hosts for this loop 28983 1726883101.86820: done getting the remaining hosts for this loop 28983 1726883101.86825: getting the next task for host managed_node2 28983 1726883101.86835: done getting next task for host managed_node2 28983 1726883101.86840: ^ task is: TASK: Cleanup 28983 1726883101.86844: ^ state is: HOST STATE: block=7, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883101.86849: getting variables 28983 1726883101.86851: in VariableManager get_vars() 28983 1726883101.86900: Calling all_inventory to load vars for managed_node2 28983 1726883101.86904: Calling groups_inventory to load vars for managed_node2 28983 1726883101.86908: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.86920: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.86924: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.86928: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.89429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883101.92845: done with get_vars() 28983 1726883101.92897: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:45:01 -0400 (0:00:00.094) 0:02:11.927 ****** 28983 1726883101.93023: entering _queue_task() for managed_node2/include_tasks 28983 1726883101.93561: worker is 1 (out of 1 available) 28983 1726883101.93573: exiting _queue_task() for managed_node2/include_tasks 28983 1726883101.93585: done queuing things up, now waiting for results queue to drain 28983 1726883101.93587: waiting for pending results... 
28983 1726883101.93807: running TaskExecutor() for managed_node2/TASK: Cleanup 28983 1726883101.93943: in run() - task 0affe814-3a2d-b16d-c0a7-00000000174f 28983 1726883101.93981: variable 'ansible_search_path' from source: unknown 28983 1726883101.93985: variable 'ansible_search_path' from source: unknown 28983 1726883101.94021: variable 'lsr_cleanup' from source: include params 28983 1726883101.94280: variable 'lsr_cleanup' from source: include params 28983 1726883101.94380: variable 'omit' from source: magic vars 28983 1726883101.94559: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883101.94573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883101.94639: variable 'omit' from source: magic vars 28983 1726883101.94925: variable 'ansible_distribution_major_version' from source: facts 28983 1726883101.94938: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883101.94947: variable 'item' from source: unknown 28983 1726883101.95036: variable 'item' from source: unknown 28983 1726883101.95079: variable 'item' from source: unknown 28983 1726883101.95159: variable 'item' from source: unknown 28983 1726883101.95539: dumping result to json 28983 1726883101.95544: done dumping result, returning 28983 1726883101.95547: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affe814-3a2d-b16d-c0a7-00000000174f] 28983 1726883101.95549: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000174f 28983 1726883101.95810: no more pending results, returning what we have 28983 1726883101.95815: in VariableManager get_vars() 28983 1726883101.95861: Calling all_inventory to load vars for managed_node2 28983 1726883101.95864: Calling groups_inventory to load vars for managed_node2 28983 1726883101.95868: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883101.95880: Calling all_plugins_play to load vars for managed_node2 28983 1726883101.95884: 
Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883101.95895: Calling groups_plugins_play to load vars for managed_node2 28983 1726883101.96418: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000174f 28983 1726883101.96423: WORKER PROCESS EXITING 28983 1726883101.98348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883102.01313: done with get_vars() 28983 1726883102.01359: variable 'ansible_search_path' from source: unknown 28983 1726883102.01361: variable 'ansible_search_path' from source: unknown 28983 1726883102.01414: we have included files to process 28983 1726883102.01416: generating all_blocks data 28983 1726883102.01419: done generating all_blocks data 28983 1726883102.01425: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883102.01427: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883102.01430: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883102.01692: done processing included file 28983 1726883102.01694: iterating over new_blocks loaded from include file 28983 1726883102.01696: in VariableManager get_vars() 28983 1726883102.01719: done with get_vars() 28983 1726883102.01722: filtering new block on tags 28983 1726883102.01758: done filtering new block on tags 28983 1726883102.01760: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 28983 1726883102.01767: extending task lists for all hosts with included blocks 28983 
1726883102.03759: done extending task lists 28983 1726883102.03761: done processing included files 28983 1726883102.03762: results queue empty 28983 1726883102.03763: checking for any_errors_fatal 28983 1726883102.03768: done checking for any_errors_fatal 28983 1726883102.03769: checking for max_fail_percentage 28983 1726883102.03770: done checking for max_fail_percentage 28983 1726883102.03774: checking to see if all hosts have failed and the running result is not ok 28983 1726883102.03775: done checking to see if all hosts have failed 28983 1726883102.03776: getting the remaining hosts for this loop 28983 1726883102.03778: done getting the remaining hosts for this loop 28983 1726883102.03781: getting the next task for host managed_node2 28983 1726883102.03787: done getting next task for host managed_node2 28983 1726883102.03790: ^ task is: TASK: Cleanup profile and device 28983 1726883102.03794: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883102.03797: getting variables 28983 1726883102.03798: in VariableManager get_vars() 28983 1726883102.03814: Calling all_inventory to load vars for managed_node2 28983 1726883102.03817: Calling groups_inventory to load vars for managed_node2 28983 1726883102.03820: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883102.03827: Calling all_plugins_play to load vars for managed_node2 28983 1726883102.03830: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883102.03836: Calling groups_plugins_play to load vars for managed_node2 28983 1726883102.05953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883102.08818: done with get_vars() 28983 1726883102.08854: done getting variables 28983 1726883102.08914: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:45:02 -0400 (0:00:00.159) 0:02:12.087 ****** 28983 1726883102.08955: entering _queue_task() for managed_node2/shell 28983 1726883102.09375: worker is 1 (out of 1 available) 28983 1726883102.09389: exiting _queue_task() for managed_node2/shell 28983 1726883102.09401: done queuing things up, now waiting for results queue to drain 28983 1726883102.09403: waiting for pending results... 
28983 1726883102.09767: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 28983 1726883102.09887: in run() - task 0affe814-3a2d-b16d-c0a7-00000000200b 28983 1726883102.10040: variable 'ansible_search_path' from source: unknown 28983 1726883102.10045: variable 'ansible_search_path' from source: unknown 28983 1726883102.10048: calling self._execute() 28983 1726883102.10083: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883102.10096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883102.10112: variable 'omit' from source: magic vars 28983 1726883102.10556: variable 'ansible_distribution_major_version' from source: facts 28983 1726883102.10578: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883102.10592: variable 'omit' from source: magic vars 28983 1726883102.10661: variable 'omit' from source: magic vars 28983 1726883102.10869: variable 'interface' from source: play vars 28983 1726883102.10901: variable 'omit' from source: magic vars 28983 1726883102.10961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883102.11011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883102.11046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883102.11077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883102.11095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883102.11239: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883102.11243: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883102.11248: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883102.11296: Set connection var ansible_connection to ssh 28983 1726883102.11315: Set connection var ansible_shell_executable to /bin/sh 28983 1726883102.11332: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883102.11352: Set connection var ansible_timeout to 10 28983 1726883102.11370: Set connection var ansible_pipelining to False 28983 1726883102.11382: Set connection var ansible_shell_type to sh 28983 1726883102.11412: variable 'ansible_shell_executable' from source: unknown 28983 1726883102.11474: variable 'ansible_connection' from source: unknown 28983 1726883102.11478: variable 'ansible_module_compression' from source: unknown 28983 1726883102.11480: variable 'ansible_shell_type' from source: unknown 28983 1726883102.11482: variable 'ansible_shell_executable' from source: unknown 28983 1726883102.11484: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883102.11486: variable 'ansible_pipelining' from source: unknown 28983 1726883102.11489: variable 'ansible_timeout' from source: unknown 28983 1726883102.11491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883102.11659: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883102.11690: variable 'omit' from source: magic vars 28983 1726883102.11703: starting attempt loop 28983 1726883102.11800: running the handler 28983 1726883102.11804: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883102.11807: _low_level_execute_command(): starting 28983 1726883102.11809: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883102.12550: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883102.12565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883102.12578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883102.12600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883102.12614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883102.12682: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883102.12733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883102.12761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883102.12865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883102.14652: stdout chunk (state=3): >>>/root <<< 28983 
1726883102.14842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883102.14846: stdout chunk (state=3): >>><<< 28983 1726883102.14848: stderr chunk (state=3): >>><<< 28983 1726883102.14869: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883102.14941: _low_level_execute_command(): starting 28983 1726883102.14946: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351 `" && echo ansible-tmp-1726883102.148779-33882-132943020297351="` echo /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351 `" ) && sleep 0' 28983 1726883102.15522: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883102.15540: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883102.15595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883102.15616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883102.15637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883102.15717: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883102.15759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883102.15778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883102.15803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883102.15914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883102.17943: stdout chunk (state=3): >>>ansible-tmp-1726883102.148779-33882-132943020297351=/root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351 <<< 28983 1726883102.18245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883102.18248: stdout chunk (state=3): >>><<< 28983 1726883102.18251: stderr chunk (state=3): >>><<< 28983 1726883102.18253: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726883102.148779-33882-132943020297351=/root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883102.18256: variable 'ansible_module_compression' from source: unknown 28983 1726883102.18259: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883102.18294: variable 'ansible_facts' from source: unknown 28983 1726883102.18395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py 28983 1726883102.18619: Sending initial data 28983 1726883102.18623: Sent initial data (155 bytes) 28983 1726883102.19187: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883102.19254: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883102.19321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883102.19344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883102.19386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883102.19448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883102.21141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883102.21229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883102.21335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxnjowpy0 /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py <<< 28983 1726883102.21339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py" <<< 28983 1726883102.21398: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxnjowpy0" to remote "/root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py" <<< 28983 1726883102.22878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883102.22882: stdout chunk (state=3): >>><<< 28983 1726883102.22885: stderr chunk (state=3): >>><<< 28983 1726883102.22887: done transferring module to remote 28983 1726883102.22889: _low_level_execute_command(): starting 28983 1726883102.22891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/ /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py && sleep 0' 28983 1726883102.23429: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883102.23458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883102.23477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883102.23552: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883102.23607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883102.23627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883102.23653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883102.23744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883102.25840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883102.25844: stdout chunk (state=3): >>><<< 28983 1726883102.25852: stderr chunk (state=3): >>><<< 28983 1726883102.25928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883102.25932: _low_level_execute_command(): starting 28983 1726883102.25947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/AnsiballZ_command.py && sleep 0' 28983 1726883102.27277: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883102.27508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883102.27556: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726883102.28296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883102.48722: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:45:02.450394", "end": "2024-09-20 21:45:02.485998", "delta": "0:00:00.035604", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883102.50458: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883102.50462: stdout chunk (state=3): >>><<< 28983 1726883102.50513: stderr chunk (state=3): >>><<< 28983 1726883102.50667: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:45:02.450394", "end": "2024-09-20 21:45:02.485998", "delta": "0:00:00.035604", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 28983 1726883102.50719: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883102.50730: _low_level_execute_command(): starting 28983 1726883102.50939: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883102.148779-33882-132943020297351/ > /dev/null 2>&1 && sleep 0' 28983 1726883102.51953: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883102.52030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not 
found <<< 28983 1726883102.52038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883102.52055: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883102.52154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883102.52158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883102.52258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883102.52262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883102.52286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883102.52405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883102.54452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883102.54839: stderr chunk (state=3): >>><<< 28983 1726883102.54843: stdout chunk (state=3): >>><<< 28983 1726883102.54845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883102.54848: handler run complete 28983 1726883102.54850: Evaluated conditional (False): False 28983 1726883102.54852: attempt loop complete, returning result 28983 1726883102.54854: _execute() done 28983 1726883102.54856: dumping result to json 28983 1726883102.54858: done dumping result, returning 28983 1726883102.54860: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0affe814-3a2d-b16d-c0a7-00000000200b] 28983 1726883102.54863: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000200b 28983 1726883102.54949: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000200b 28983 1726883102.54953: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.035604", "end": "2024-09-20 21:45:02.485998", "rc": 1, "start": "2024-09-20 21:45:02.450394" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 28983 1726883102.55051: no more pending results, returning what we have 28983 1726883102.55056: results queue empty 28983 1726883102.55057: checking for any_errors_fatal 28983 1726883102.55060: done checking for any_errors_fatal 28983 1726883102.55061: checking for max_fail_percentage 28983 1726883102.55065: done checking for max_fail_percentage 28983 1726883102.55067: checking to see if all hosts have failed and the running result is not ok 28983 1726883102.55068: done checking to see if all hosts have failed 28983 1726883102.55069: getting the remaining hosts for this loop 28983 1726883102.55074: done getting the remaining hosts for this loop 28983 1726883102.55080: getting the next task for host managed_node2 28983 1726883102.55096: done getting next task for host managed_node2 28983 1726883102.55100: ^ task is: TASK: Include the task 'run_test.yml' 28983 1726883102.55103: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883102.55108: getting variables 28983 1726883102.55110: in VariableManager get_vars() 28983 1726883102.55370: Calling all_inventory to load vars for managed_node2 28983 1726883102.55377: Calling groups_inventory to load vars for managed_node2 28983 1726883102.55381: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883102.55395: Calling all_plugins_play to load vars for managed_node2 28983 1726883102.55399: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883102.55404: Calling groups_plugins_play to load vars for managed_node2 28983 1726883102.60660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883102.67019: done with get_vars() 28983 1726883102.67214: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124 Friday 20 September 2024 21:45:02 -0400 (0:00:00.584) 0:02:12.671 ****** 28983 1726883102.67382: entering _queue_task() for managed_node2/include_tasks 28983 1726883102.68301: worker is 1 (out of 1 available) 28983 1726883102.68315: exiting _queue_task() for managed_node2/include_tasks 28983 1726883102.68327: done queuing things up, now waiting for results queue to drain 28983 1726883102.68329: waiting for pending results... 
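The "Cleanup profile and device" task above runs four commands (`nmcli con delete`, `nmcli con load`, `rm -f`, `ip link del`) and relies on `ignore_errors` so that an already-absent profile or device (rc=1, "unknown connection 'statebr'") does not abort the play, hence the `...ignoring` in the result. A minimal standalone sketch of the same tolerance, using a `|| ...` fallback in place of `ignore_errors` (the `cleanup_step` helper and the use of `false`/`true` as stand-in commands are illustrative assumptions, not part of the playbook):

```shell
#!/bin/sh
# Sketch: run each cleanup command but keep going on failure, mirroring
# the task's ignore_errors — the profile/device may already be absent.
cleanup_step() {
  # "$@" is the command; on failure, log and continue instead of exiting
  "$@" 2>/dev/null || echo "ignored failure: $*"
}

# In the real task the steps are:
#   nmcli con delete statebr
#   nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
#   rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
#   ip link del statebr
# Here 'false' stands in for a step that fails (connection already gone)
# and 'true' for one that succeeds.
cleanup_step false
cleanup_step true
echo "cleanup finished"
```

Because each step is wrapped individually, one missing resource does not prevent the remaining steps from running, which is exactly why the playbook's combined command still removes the ifcfg file even when `nmcli con delete` fails.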
28983 1726883102.69158: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 28983 1726883102.69240: in run() - task 0affe814-3a2d-b16d-c0a7-000000000017 28983 1726883102.69244: variable 'ansible_search_path' from source: unknown 28983 1726883102.69481: calling self._execute() 28983 1726883102.69557: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883102.69606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883102.69626: variable 'omit' from source: magic vars 28983 1726883102.70839: variable 'ansible_distribution_major_version' from source: facts 28983 1726883102.70844: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883102.70849: _execute() done 28983 1726883102.70852: dumping result to json 28983 1726883102.70854: done dumping result, returning 28983 1726883102.70858: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0affe814-3a2d-b16d-c0a7-000000000017] 28983 1726883102.70860: sending task result for task 0affe814-3a2d-b16d-c0a7-000000000017 28983 1726883102.70955: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000000017 28983 1726883102.70994: no more pending results, returning what we have 28983 1726883102.71000: in VariableManager get_vars() 28983 1726883102.71063: Calling all_inventory to load vars for managed_node2 28983 1726883102.71069: Calling groups_inventory to load vars for managed_node2 28983 1726883102.71076: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883102.71092: Calling all_plugins_play to load vars for managed_node2 28983 1726883102.71096: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883102.71100: Calling groups_plugins_play to load vars for managed_node2 28983 1726883102.71981: WORKER PROCESS EXITING 28983 1726883102.76601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 28983 1726883102.82898: done with get_vars() 28983 1726883102.83049: variable 'ansible_search_path' from source: unknown 28983 1726883102.83068: we have included files to process 28983 1726883102.83069: generating all_blocks data 28983 1726883102.83074: done generating all_blocks data 28983 1726883102.83081: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883102.83082: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883102.83086: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 28983 1726883102.84299: in VariableManager get_vars() 28983 1726883102.84326: done with get_vars() 28983 1726883102.84517: in VariableManager get_vars() 28983 1726883102.84543: done with get_vars() 28983 1726883102.84662: in VariableManager get_vars() 28983 1726883102.84803: done with get_vars() 28983 1726883102.84858: in VariableManager get_vars() 28983 1726883102.84884: done with get_vars() 28983 1726883102.85055: in VariableManager get_vars() 28983 1726883102.85081: done with get_vars() 28983 1726883102.86270: in VariableManager get_vars() 28983 1726883102.86295: done with get_vars() 28983 1726883102.86311: done processing included file 28983 1726883102.86313: iterating over new_blocks loaded from include file 28983 1726883102.86314: in VariableManager get_vars() 28983 1726883102.86331: done with get_vars() 28983 1726883102.86333: filtering new block on tags 28983 1726883102.86697: done filtering new block on tags 28983 1726883102.86700: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 28983 1726883102.86707: extending task lists for all hosts with included 
blocks 28983 1726883102.86801: done extending task lists 28983 1726883102.86802: done processing included files 28983 1726883102.86803: results queue empty 28983 1726883102.86804: checking for any_errors_fatal 28983 1726883102.86811: done checking for any_errors_fatal 28983 1726883102.86812: checking for max_fail_percentage 28983 1726883102.86813: done checking for max_fail_percentage 28983 1726883102.86814: checking to see if all hosts have failed and the running result is not ok 28983 1726883102.86815: done checking to see if all hosts have failed 28983 1726883102.86816: getting the remaining hosts for this loop 28983 1726883102.86818: done getting the remaining hosts for this loop 28983 1726883102.86821: getting the next task for host managed_node2 28983 1726883102.86825: done getting next task for host managed_node2 28983 1726883102.86828: ^ task is: TASK: TEST: {{ lsr_description }} 28983 1726883102.86831: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883102.86996: getting variables 28983 1726883102.86998: in VariableManager get_vars() 28983 1726883102.87012: Calling all_inventory to load vars for managed_node2 28983 1726883102.87014: Calling groups_inventory to load vars for managed_node2 28983 1726883102.87018: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883102.87024: Calling all_plugins_play to load vars for managed_node2 28983 1726883102.87027: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883102.87031: Calling groups_plugins_play to load vars for managed_node2 28983 1726883103.03451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883103.09712: done with get_vars() 28983 1726883103.09863: done getting variables 28983 1726883103.09917: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883103.10033: variable 'lsr_description' from source: include params TASK [TEST: I will not get an error when I try to remove an absent profile] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:45:03 -0400 (0:00:00.429) 0:02:13.100 ****** 28983 1726883103.10284: entering _queue_task() for managed_node2/debug 28983 1726883103.11190: worker is 1 (out of 1 available) 28983 1726883103.11204: exiting _queue_task() for managed_node2/debug 28983 1726883103.11217: done queuing things up, now waiting for results queue to drain 28983 1726883103.11220: waiting for pending results... 
28983 1726883103.11746: running TaskExecutor() for managed_node2/TASK: TEST: I will not get an error when I try to remove an absent profile 28983 1726883103.12018: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020ad 28983 1726883103.12165: variable 'ansible_search_path' from source: unknown 28983 1726883103.12179: variable 'ansible_search_path' from source: unknown 28983 1726883103.12225: calling self._execute() 28983 1726883103.12531: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.12554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.12578: variable 'omit' from source: magic vars 28983 1726883103.13637: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.13657: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.13700: variable 'omit' from source: magic vars 28983 1726883103.13821: variable 'omit' from source: magic vars 28983 1726883103.14082: variable 'lsr_description' from source: include params 28983 1726883103.14152: variable 'omit' from source: magic vars 28983 1726883103.14281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883103.14379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.14413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883103.14477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.14561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.14715: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.14719: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726883103.14721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.14949: Set connection var ansible_connection to ssh 28983 1726883103.15018: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.15239: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.15243: Set connection var ansible_timeout to 10 28983 1726883103.15245: Set connection var ansible_pipelining to False 28983 1726883103.15248: Set connection var ansible_shell_type to sh 28983 1726883103.15250: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.15253: variable 'ansible_connection' from source: unknown 28983 1726883103.15255: variable 'ansible_module_compression' from source: unknown 28983 1726883103.15257: variable 'ansible_shell_type' from source: unknown 28983 1726883103.15260: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.15262: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.15265: variable 'ansible_pipelining' from source: unknown 28983 1726883103.15268: variable 'ansible_timeout' from source: unknown 28983 1726883103.15271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.15640: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.15711: variable 'omit' from source: magic vars 28983 1726883103.15717: starting attempt loop 28983 1726883103.15721: running the handler 28983 1726883103.15786: handler run complete 28983 1726883103.16043: attempt loop complete, returning result 28983 1726883103.16047: _execute() done 28983 1726883103.16050: dumping result to json 28983 1726883103.16052: done dumping result, 
returning 28983 1726883103.16055: done running TaskExecutor() for managed_node2/TASK: TEST: I will not get an error when I try to remove an absent profile [0affe814-3a2d-b16d-c0a7-0000000020ad] 28983 1726883103.16057: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020ad 28983 1726883103.16128: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020ad 28983 1726883103.16131: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I will not get an error when I try to remove an absent profile ########## 28983 1726883103.16208: no more pending results, returning what we have 28983 1726883103.16213: results queue empty 28983 1726883103.16214: checking for any_errors_fatal 28983 1726883103.16216: done checking for any_errors_fatal 28983 1726883103.16217: checking for max_fail_percentage 28983 1726883103.16219: done checking for max_fail_percentage 28983 1726883103.16220: checking to see if all hosts have failed and the running result is not ok 28983 1726883103.16221: done checking to see if all hosts have failed 28983 1726883103.16222: getting the remaining hosts for this loop 28983 1726883103.16225: done getting the remaining hosts for this loop 28983 1726883103.16231: getting the next task for host managed_node2 28983 1726883103.16241: done getting next task for host managed_node2 28983 1726883103.16245: ^ task is: TASK: Show item 28983 1726883103.16249: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883103.16260: getting variables 28983 1726883103.16262: in VariableManager get_vars() 28983 1726883103.16316: Calling all_inventory to load vars for managed_node2 28983 1726883103.16320: Calling groups_inventory to load vars for managed_node2 28983 1726883103.16325: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883103.16640: Calling all_plugins_play to load vars for managed_node2 28983 1726883103.16650: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883103.16655: Calling groups_plugins_play to load vars for managed_node2 28983 1726883103.21395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883103.29327: done with get_vars() 28983 1726883103.29484: done getting variables 28983 1726883103.29707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:45:03 -0400 (0:00:00.194) 0:02:13.295 ****** 28983 1726883103.29747: entering _queue_task() for managed_node2/debug 28983 1726883103.31141: worker is 1 (out of 1 available) 28983 1726883103.31156: exiting _queue_task() for managed_node2/debug 28983 1726883103.31288: done queuing things up, now waiting for results queue to drain 28983 1726883103.31291: waiting for pending results... 
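The "Show item" task that follows is a looped `debug` action: for each include parameter name (`lsr_description`, `lsr_setup`, `lsr_test`, ...) it resolves the variable of that name and prints it as `ok: (item=...) => {...}`. A rough shell sketch of that indirection, assuming stub values taken from the log output below (the variable names are real; treating them as shell variables is purely illustrative):

```shell
#!/bin/sh
# Stub values matching what the log reports for these include params.
lsr_description="I will not get an error when I try to remove an absent profile"
lsr_test="tasks/remove+down_profile.yml"

# The loop variable is the *name* of a parameter; eval performs the
# indirect lookup, analogous to Ansible resolving the templated item.
for item in lsr_description lsr_test; do
  eval "value=\$$item"
  echo "ok: (item=$item) => $value"
done
```

This mirrors why the log shows the `item` variable being resolved twice per iteration: once for the loop label and once for the value being printed.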
28983 1726883103.31703: running TaskExecutor() for managed_node2/TASK: Show item 28983 1726883103.32013: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020ae 28983 1726883103.32030: variable 'ansible_search_path' from source: unknown 28983 1726883103.32035: variable 'ansible_search_path' from source: unknown 28983 1726883103.32266: variable 'omit' from source: magic vars 28983 1726883103.32502: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.32552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.32556: variable 'omit' from source: magic vars 28983 1726883103.33100: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.33103: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.33107: variable 'omit' from source: magic vars 28983 1726883103.33159: variable 'omit' from source: magic vars 28983 1726883103.33213: variable 'item' from source: unknown 28983 1726883103.33440: variable 'item' from source: unknown 28983 1726883103.33607: variable 'omit' from source: magic vars 28983 1726883103.33753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883103.33762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.33788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883103.33861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.33876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.34111: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.34115: variable 'ansible_host' from source: host vars for 'managed_node2' 
28983 1726883103.34120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.34615: Set connection var ansible_connection to ssh 28983 1726883103.34638: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.34679: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.34692: Set connection var ansible_timeout to 10 28983 1726883103.35042: Set connection var ansible_pipelining to False 28983 1726883103.35045: Set connection var ansible_shell_type to sh 28983 1726883103.35048: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.35051: variable 'ansible_connection' from source: unknown 28983 1726883103.35053: variable 'ansible_module_compression' from source: unknown 28983 1726883103.35056: variable 'ansible_shell_type' from source: unknown 28983 1726883103.35058: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.35060: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.35063: variable 'ansible_pipelining' from source: unknown 28983 1726883103.35065: variable 'ansible_timeout' from source: unknown 28983 1726883103.35068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.35132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.35157: variable 'omit' from source: magic vars 28983 1726883103.35164: starting attempt loop 28983 1726883103.35167: running the handler 28983 1726883103.35229: variable 'lsr_description' from source: include params 28983 1726883103.35440: variable 'lsr_description' from source: include params 28983 1726883103.35443: handler run complete 28983 1726883103.35446: attempt loop 
complete, returning result 28983 1726883103.35448: variable 'item' from source: unknown 28983 1726883103.35480: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 28983 1726883103.35668: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.35672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.35675: variable 'omit' from source: magic vars 28983 1726883103.36141: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.36144: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.36147: variable 'omit' from source: magic vars 28983 1726883103.36150: variable 'omit' from source: magic vars 28983 1726883103.36152: variable 'item' from source: unknown 28983 1726883103.36155: variable 'item' from source: unknown 28983 1726883103.36157: variable 'omit' from source: magic vars 28983 1726883103.36159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.36162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.36165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.36167: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.36169: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.36171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.36411: Set connection var ansible_connection to ssh 28983 1726883103.36415: Set connection var 
ansible_shell_executable to /bin/sh 28983 1726883103.36418: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.36421: Set connection var ansible_timeout to 10 28983 1726883103.36423: Set connection var ansible_pipelining to False 28983 1726883103.36426: Set connection var ansible_shell_type to sh 28983 1726883103.36428: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.36431: variable 'ansible_connection' from source: unknown 28983 1726883103.36433: variable 'ansible_module_compression' from source: unknown 28983 1726883103.36444: variable 'ansible_shell_type' from source: unknown 28983 1726883103.36448: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.36451: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.36454: variable 'ansible_pipelining' from source: unknown 28983 1726883103.36457: variable 'ansible_timeout' from source: unknown 28983 1726883103.36460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.36557: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.36568: variable 'omit' from source: magic vars 28983 1726883103.36614: starting attempt loop 28983 1726883103.36617: running the handler 28983 1726883103.36620: variable 'lsr_setup' from source: include params 28983 1726883103.36708: variable 'lsr_setup' from source: include params 28983 1726883103.36763: handler run complete 28983 1726883103.36790: attempt loop complete, returning result 28983 1726883103.36831: variable 'item' from source: unknown 28983 1726883103.37091: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": 
"lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 28983 1726883103.37180: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.37322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.37326: variable 'omit' from source: magic vars 28983 1726883103.37631: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.37635: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.37641: variable 'omit' from source: magic vars 28983 1726883103.37724: variable 'omit' from source: magic vars 28983 1726883103.37948: variable 'item' from source: unknown 28983 1726883103.38044: variable 'item' from source: unknown 28983 1726883103.38048: variable 'omit' from source: magic vars 28983 1726883103.38067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.38073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.38139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.38143: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.38145: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.38147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.38399: Set connection var ansible_connection to ssh 28983 1726883103.38466: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.38496: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.38508: Set connection var ansible_timeout to 10 28983 1726883103.38516: Set 
connection var ansible_pipelining to False 28983 1726883103.38518: Set connection var ansible_shell_type to sh 28983 1726883103.38595: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.38610: variable 'ansible_connection' from source: unknown 28983 1726883103.38614: variable 'ansible_module_compression' from source: unknown 28983 1726883103.38617: variable 'ansible_shell_type' from source: unknown 28983 1726883103.38619: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.38621: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.38624: variable 'ansible_pipelining' from source: unknown 28983 1726883103.38626: variable 'ansible_timeout' from source: unknown 28983 1726883103.38633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.38845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.38848: variable 'omit' from source: magic vars 28983 1726883103.38851: starting attempt loop 28983 1726883103.38853: running the handler 28983 1726883103.38855: variable 'lsr_test' from source: include params 28983 1726883103.39060: variable 'lsr_test' from source: include params 28983 1726883103.39063: handler run complete 28983 1726883103.39065: attempt loop complete, returning result 28983 1726883103.39068: variable 'item' from source: unknown 28983 1726883103.39070: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 28983 1726883103.39193: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.39197: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883103.39200: variable 'omit' from source: magic vars 28983 1726883103.39387: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.39390: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.39393: variable 'omit' from source: magic vars 28983 1726883103.39488: variable 'omit' from source: magic vars 28983 1726883103.39491: variable 'item' from source: unknown 28983 1726883103.39529: variable 'item' from source: unknown 28983 1726883103.39547: variable 'omit' from source: magic vars 28983 1726883103.39567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.39584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.39595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.39605: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.39609: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.39614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.39710: Set connection var ansible_connection to ssh 28983 1726883103.39819: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.39823: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.39825: Set connection var ansible_timeout to 10 28983 1726883103.39828: Set connection var ansible_pipelining to False 28983 1726883103.39830: Set connection var ansible_shell_type to sh 28983 1726883103.39832: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.39837: variable 'ansible_connection' from source: unknown 28983 1726883103.39839: 
variable 'ansible_module_compression' from source: unknown 28983 1726883103.39841: variable 'ansible_shell_type' from source: unknown 28983 1726883103.39844: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.39846: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.39848: variable 'ansible_pipelining' from source: unknown 28983 1726883103.39850: variable 'ansible_timeout' from source: unknown 28983 1726883103.39852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.40037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.40042: variable 'omit' from source: magic vars 28983 1726883103.40045: starting attempt loop 28983 1726883103.40047: running the handler 28983 1726883103.40050: variable 'lsr_assert' from source: include params 28983 1726883103.40052: variable 'lsr_assert' from source: include params 28983 1726883103.40063: handler run complete 28983 1726883103.40082: attempt loop complete, returning result 28983 1726883103.40099: variable 'item' from source: unknown 28983 1726883103.40439: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 28983 1726883103.40515: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.40519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.40522: variable 'omit' from source: magic vars 28983 1726883103.40639: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.40647: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 28983 1726883103.40650: variable 'omit' from source: magic vars 28983 1726883103.40652: variable 'omit' from source: magic vars 28983 1726883103.40664: variable 'item' from source: unknown 28983 1726883103.40839: variable 'item' from source: unknown 28983 1726883103.40842: variable 'omit' from source: magic vars 28983 1726883103.40846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.40848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.40851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.40855: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.40858: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.40861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.40979: Set connection var ansible_connection to ssh 28983 1726883103.40982: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.40985: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.40988: Set connection var ansible_timeout to 10 28983 1726883103.40990: Set connection var ansible_pipelining to False 28983 1726883103.40993: Set connection var ansible_shell_type to sh 28983 1726883103.40996: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.40998: variable 'ansible_connection' from source: unknown 28983 1726883103.41001: variable 'ansible_module_compression' from source: unknown 28983 1726883103.41003: variable 'ansible_shell_type' from source: unknown 28983 1726883103.41006: variable 'ansible_shell_executable' from source: unknown 28983 
1726883103.41008: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.41087: variable 'ansible_pipelining' from source: unknown 28983 1726883103.41091: variable 'ansible_timeout' from source: unknown 28983 1726883103.41093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.41135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.41145: variable 'omit' from source: magic vars 28983 1726883103.41148: starting attempt loop 28983 1726883103.41239: running the handler 28983 1726883103.41243: variable 'lsr_assert_when' from source: include params 28983 1726883103.41254: variable 'lsr_assert_when' from source: include params 28983 1726883103.41438: variable 'network_provider' from source: set_fact 28983 1726883103.41442: handler run complete 28983 1726883103.41445: attempt loop complete, returning result 28983 1726883103.41463: variable 'item' from source: unknown 28983 1726883103.41546: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 28983 1726883103.41856: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.41860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.41864: variable 'omit' from source: magic vars 28983 1726883103.41867: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.41870: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.41875: variable 'omit' from source: magic vars 28983 1726883103.42100: 
variable 'omit' from source: magic vars 28983 1726883103.42104: variable 'item' from source: unknown 28983 1726883103.42107: variable 'item' from source: unknown 28983 1726883103.42109: variable 'omit' from source: magic vars 28983 1726883103.42111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.42113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.42115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.42118: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.42120: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.42122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.42317: Set connection var ansible_connection to ssh 28983 1726883103.42321: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.42323: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.42326: Set connection var ansible_timeout to 10 28983 1726883103.42328: Set connection var ansible_pipelining to False 28983 1726883103.42330: Set connection var ansible_shell_type to sh 28983 1726883103.42332: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.42336: variable 'ansible_connection' from source: unknown 28983 1726883103.42338: variable 'ansible_module_compression' from source: unknown 28983 1726883103.42341: variable 'ansible_shell_type' from source: unknown 28983 1726883103.42343: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.42345: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.42347: variable 'ansible_pipelining' from source: 
unknown 28983 1726883103.42349: variable 'ansible_timeout' from source: unknown 28983 1726883103.42351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.42416: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.42428: variable 'omit' from source: magic vars 28983 1726883103.42435: starting attempt loop 28983 1726883103.42438: running the handler 28983 1726883103.42461: variable 'lsr_fail_debug' from source: play vars 28983 1726883103.42641: variable 'lsr_fail_debug' from source: play vars 28983 1726883103.42645: handler run complete 28983 1726883103.42647: attempt loop complete, returning result 28983 1726883103.42650: variable 'item' from source: unknown 28983 1726883103.42676: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 28983 1726883103.42779: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.42789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.42801: variable 'omit' from source: magic vars 28983 1726883103.43080: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.43083: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.43087: variable 'omit' from source: magic vars 28983 1726883103.43089: variable 'omit' from source: magic vars 28983 1726883103.43098: variable 'item' from source: unknown 28983 1726883103.43179: variable 'item' from source: unknown 28983 1726883103.43195: variable 'omit' from source: magic vars 28983 1726883103.43541: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883103.43548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.43551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883103.43553: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883103.43556: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.43558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.43561: Set connection var ansible_connection to ssh 28983 1726883103.43563: Set connection var ansible_shell_executable to /bin/sh 28983 1726883103.43565: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883103.43567: Set connection var ansible_timeout to 10 28983 1726883103.43569: Set connection var ansible_pipelining to False 28983 1726883103.43575: Set connection var ansible_shell_type to sh 28983 1726883103.43577: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.43579: variable 'ansible_connection' from source: unknown 28983 1726883103.43581: variable 'ansible_module_compression' from source: unknown 28983 1726883103.43583: variable 'ansible_shell_type' from source: unknown 28983 1726883103.43585: variable 'ansible_shell_executable' from source: unknown 28983 1726883103.43588: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.43590: variable 'ansible_pipelining' from source: unknown 28983 1726883103.43592: variable 'ansible_timeout' from source: unknown 28983 1726883103.43594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.43596: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883103.43598: variable 'omit' from source: magic vars 28983 1726883103.43600: starting attempt loop 28983 1726883103.43602: running the handler 28983 1726883103.43605: variable 'lsr_cleanup' from source: include params 28983 1726883103.43760: variable 'lsr_cleanup' from source: include params 28983 1726883103.43763: handler run complete 28983 1726883103.43766: attempt loop complete, returning result 28983 1726883103.43769: variable 'item' from source: unknown 28983 1726883103.43815: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 28983 1726883103.43912: dumping result to json 28983 1726883103.43916: done dumping result, returning 28983 1726883103.43920: done running TaskExecutor() for managed_node2/TASK: Show item [0affe814-3a2d-b16d-c0a7-0000000020ae] 28983 1726883103.43923: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020ae 28983 1726883103.44198: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020ae 28983 1726883103.44201: WORKER PROCESS EXITING 28983 1726883103.44411: no more pending results, returning what we have 28983 1726883103.44415: results queue empty 28983 1726883103.44416: checking for any_errors_fatal 28983 1726883103.44422: done checking for any_errors_fatal 28983 1726883103.44423: checking for max_fail_percentage 28983 1726883103.44424: done checking for max_fail_percentage 28983 1726883103.44425: checking to see if all hosts have failed and the running result is not ok 28983 1726883103.44426: done checking to see if all hosts have failed 28983 1726883103.44427: getting the 
remaining hosts for this loop 28983 1726883103.44429: done getting the remaining hosts for this loop 28983 1726883103.44433: getting the next task for host managed_node2 28983 1726883103.44446: done getting next task for host managed_node2 28983 1726883103.44450: ^ task is: TASK: Include the task 'show_interfaces.yml' 28983 1726883103.44453: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883103.44457: getting variables 28983 1726883103.44458: in VariableManager get_vars() 28983 1726883103.44499: Calling all_inventory to load vars for managed_node2 28983 1726883103.44502: Calling groups_inventory to load vars for managed_node2 28983 1726883103.44506: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883103.44516: Calling all_plugins_play to load vars for managed_node2 28983 1726883103.44519: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883103.44523: Calling groups_plugins_play to load vars for managed_node2 28983 1726883103.47321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883103.53681: done with get_vars() 28983 1726883103.53842: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:45:03 -0400 (0:00:00.243) 
0:02:13.538 ****** 28983 1726883103.54077: entering _queue_task() for managed_node2/include_tasks 28983 1726883103.54887: worker is 1 (out of 1 available) 28983 1726883103.54900: exiting _queue_task() for managed_node2/include_tasks 28983 1726883103.54913: done queuing things up, now waiting for results queue to drain 28983 1726883103.55030: waiting for pending results... 28983 1726883103.55596: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 28983 1726883103.55726: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020af 28983 1726883103.55944: variable 'ansible_search_path' from source: unknown 28983 1726883103.55948: variable 'ansible_search_path' from source: unknown 28983 1726883103.55988: calling self._execute() 28983 1726883103.56102: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.56109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.56123: variable 'omit' from source: magic vars 28983 1726883103.57116: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.57246: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.57255: _execute() done 28983 1726883103.57258: dumping result to json 28983 1726883103.57264: done dumping result, returning 28983 1726883103.57274: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affe814-3a2d-b16d-c0a7-0000000020af] 28983 1726883103.57279: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020af 28983 1726883103.57403: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020af 28983 1726883103.57406: WORKER PROCESS EXITING 28983 1726883103.57448: no more pending results, returning what we have 28983 1726883103.57454: in VariableManager get_vars() 28983 1726883103.57512: Calling all_inventory to load vars for managed_node2 28983 1726883103.57516: Calling groups_inventory to load vars for 
managed_node2 28983 1726883103.57519: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883103.57636: Calling all_plugins_play to load vars for managed_node2 28983 1726883103.57649: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883103.57655: Calling groups_plugins_play to load vars for managed_node2 28983 1726883103.62883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883103.69449: done with get_vars() 28983 1726883103.69493: variable 'ansible_search_path' from source: unknown 28983 1726883103.69495: variable 'ansible_search_path' from source: unknown 28983 1726883103.69548: we have included files to process 28983 1726883103.69549: generating all_blocks data 28983 1726883103.69552: done generating all_blocks data 28983 1726883103.69644: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883103.69646: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883103.69650: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 28983 1726883103.69950: in VariableManager get_vars() 28983 1726883103.69982: done with get_vars() 28983 1726883103.70294: done processing included file 28983 1726883103.70296: iterating over new_blocks loaded from include file 28983 1726883103.70298: in VariableManager get_vars() 28983 1726883103.70430: done with get_vars() 28983 1726883103.70433: filtering new block on tags 28983 1726883103.70486: done filtering new block on tags 28983 1726883103.70489: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 28983 
1726883103.70495: extending task lists for all hosts with included blocks 28983 1726883103.71843: done extending task lists 28983 1726883103.71845: done processing included files 28983 1726883103.71846: results queue empty 28983 1726883103.71847: checking for any_errors_fatal 28983 1726883103.71854: done checking for any_errors_fatal 28983 1726883103.71855: checking for max_fail_percentage 28983 1726883103.71857: done checking for max_fail_percentage 28983 1726883103.71858: checking to see if all hosts have failed and the running result is not ok 28983 1726883103.71859: done checking to see if all hosts have failed 28983 1726883103.71860: getting the remaining hosts for this loop 28983 1726883103.71862: done getting the remaining hosts for this loop 28983 1726883103.71980: getting the next task for host managed_node2 28983 1726883103.71986: done getting next task for host managed_node2 28983 1726883103.71988: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 28983 1726883103.71992: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883103.71995: getting variables 28983 1726883103.71997: in VariableManager get_vars() 28983 1726883103.72011: Calling all_inventory to load vars for managed_node2 28983 1726883103.72014: Calling groups_inventory to load vars for managed_node2 28983 1726883103.72017: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883103.72023: Calling all_plugins_play to load vars for managed_node2 28983 1726883103.72027: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883103.72030: Calling groups_plugins_play to load vars for managed_node2 28983 1726883103.76337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883103.82756: done with get_vars() 28983 1726883103.82799: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:45:03 -0400 (0:00:00.289) 0:02:13.827 ****** 28983 1726883103.83019: entering _queue_task() for managed_node2/include_tasks 28983 1726883103.83905: worker is 1 (out of 1 available) 28983 1726883103.84149: exiting _queue_task() for managed_node2/include_tasks 28983 1726883103.84164: done queuing things up, now waiting for results queue to drain 28983 1726883103.84166: waiting for pending results... 
28983 1726883103.84697: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 28983 1726883103.85121: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020d6 28983 1726883103.85125: variable 'ansible_search_path' from source: unknown 28983 1726883103.85128: variable 'ansible_search_path' from source: unknown 28983 1726883103.85132: calling self._execute() 28983 1726883103.85359: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883103.85379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883103.85400: variable 'omit' from source: magic vars 28983 1726883103.86454: variable 'ansible_distribution_major_version' from source: facts 28983 1726883103.86483: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883103.86498: _execute() done 28983 1726883103.86549: dumping result to json 28983 1726883103.86558: done dumping result, returning 28983 1726883103.86570: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affe814-3a2d-b16d-c0a7-0000000020d6] 28983 1726883103.86590: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020d6 28983 1726883103.86862: no more pending results, returning what we have 28983 1726883103.86868: in VariableManager get_vars() 28983 1726883103.86931: Calling all_inventory to load vars for managed_node2 28983 1726883103.86936: Calling groups_inventory to load vars for managed_node2 28983 1726883103.86940: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883103.86958: Calling all_plugins_play to load vars for managed_node2 28983 1726883103.86963: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883103.86968: Calling groups_plugins_play to load vars for managed_node2 28983 1726883103.87543: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020d6 28983 1726883103.87547: WORKER PROCESS EXITING 28983 
1726883103.91836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883103.99198: done with get_vars() 28983 1726883103.99292: variable 'ansible_search_path' from source: unknown 28983 1726883103.99294: variable 'ansible_search_path' from source: unknown 28983 1726883103.99344: we have included files to process 28983 1726883103.99346: generating all_blocks data 28983 1726883103.99348: done generating all_blocks data 28983 1726883103.99350: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883103.99352: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883103.99355: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 28983 1726883104.00157: done processing included file 28983 1726883104.00160: iterating over new_blocks loaded from include file 28983 1726883104.00162: in VariableManager get_vars() 28983 1726883104.00186: done with get_vars() 28983 1726883104.00189: filtering new block on tags 28983 1726883104.00241: done filtering new block on tags 28983 1726883104.00245: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 28983 1726883104.00251: extending task lists for all hosts with included blocks 28983 1726883104.00954: done extending task lists 28983 1726883104.00956: done processing included files 28983 1726883104.00957: results queue empty 28983 1726883104.00958: checking for any_errors_fatal 28983 1726883104.00962: done checking for any_errors_fatal 28983 1726883104.00963: checking for max_fail_percentage 28983 1726883104.00964: done 
checking for max_fail_percentage 28983 1726883104.00965: checking to see if all hosts have failed and the running result is not ok 28983 1726883104.00966: done checking to see if all hosts have failed 28983 1726883104.00967: getting the remaining hosts for this loop 28983 1726883104.00969: done getting the remaining hosts for this loop 28983 1726883104.00972: getting the next task for host managed_node2 28983 1726883104.00978: done getting next task for host managed_node2 28983 1726883104.00980: ^ task is: TASK: Gather current interface info 28983 1726883104.00985: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883104.00988: getting variables 28983 1726883104.00990: in VariableManager get_vars() 28983 1726883104.01004: Calling all_inventory to load vars for managed_node2 28983 1726883104.01007: Calling groups_inventory to load vars for managed_node2 28983 1726883104.01010: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883104.01017: Calling all_plugins_play to load vars for managed_node2 28983 1726883104.01020: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883104.01024: Calling groups_plugins_play to load vars for managed_node2 28983 1726883104.04795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883104.11279: done with get_vars() 28983 1726883104.11329: done getting variables 28983 1726883104.11388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:45:04 -0400 (0:00:00.284) 0:02:14.112 ****** 28983 1726883104.11429: entering _queue_task() for managed_node2/command 28983 1726883104.12535: worker is 1 (out of 1 available) 28983 1726883104.12550: exiting _queue_task() for managed_node2/command 28983 1726883104.12564: done queuing things up, now waiting for results queue to drain 28983 1726883104.12566: waiting for pending results... 
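[Editor's note: the task queued above lives at get_current_interfaces.yml:3. Based on the module_args logged further down in this trace (chdir: /sys/class/net, _raw_params: ls -1) and the `_current_interfaces` variable consumed by the following Set current_interfaces step, the task plausibly looks like the sketch below. This is a reconstruction from the log, not the actual file contents.]

```yaml
# Hypothetical reconstruction, inferred from the logged module_args.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces
```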
28983 1726883104.13157: running TaskExecutor() for managed_node2/TASK: Gather current interface info 28983 1726883104.13433: in run() - task 0affe814-3a2d-b16d-c0a7-000000002111 28983 1726883104.13463: variable 'ansible_search_path' from source: unknown 28983 1726883104.13480: variable 'ansible_search_path' from source: unknown 28983 1726883104.13640: calling self._execute() 28983 1726883104.13824: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.13919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.13938: variable 'omit' from source: magic vars 28983 1726883104.15014: variable 'ansible_distribution_major_version' from source: facts 28983 1726883104.15056: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883104.15081: variable 'omit' from source: magic vars 28983 1726883104.15345: variable 'omit' from source: magic vars 28983 1726883104.15348: variable 'omit' from source: magic vars 28983 1726883104.15565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883104.15619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883104.15754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883104.15783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883104.15819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883104.15858: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883104.15862: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.15867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883104.16030: Set connection var ansible_connection to ssh 28983 1726883104.16050: Set connection var ansible_shell_executable to /bin/sh 28983 1726883104.16054: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883104.16066: Set connection var ansible_timeout to 10 28983 1726883104.16075: Set connection var ansible_pipelining to False 28983 1726883104.16079: Set connection var ansible_shell_type to sh 28983 1726883104.16109: variable 'ansible_shell_executable' from source: unknown 28983 1726883104.16113: variable 'ansible_connection' from source: unknown 28983 1726883104.16117: variable 'ansible_module_compression' from source: unknown 28983 1726883104.16120: variable 'ansible_shell_type' from source: unknown 28983 1726883104.16125: variable 'ansible_shell_executable' from source: unknown 28983 1726883104.16138: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.16142: variable 'ansible_pipelining' from source: unknown 28983 1726883104.16147: variable 'ansible_timeout' from source: unknown 28983 1726883104.16152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.16328: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883104.16361: variable 'omit' from source: magic vars 28983 1726883104.16368: starting attempt loop 28983 1726883104.16373: running the handler 28983 1726883104.16390: _low_level_execute_command(): starting 28983 1726883104.16399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883104.17149: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883104.17162: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 28983 1726883104.17176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883104.17315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883104.17320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883104.17323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883104.17340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883104.17484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883104.19278: stdout chunk (state=3): >>>/root <<< 28983 1726883104.19532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883104.19768: stderr chunk (state=3): >>><<< 28983 1726883104.19774: stdout chunk (state=3): >>><<< 28983 1726883104.19802: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883104.19813: _low_level_execute_command(): starting 28983 1726883104.19818: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789 `" && echo ansible-tmp-1726883104.1979764-33930-272002071805789="` echo /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789 `" ) && sleep 0' 28983 1726883104.21350: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883104.21488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883104.21596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883104.21948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883104.21958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883104.23893: stdout chunk (state=3): >>>ansible-tmp-1726883104.1979764-33930-272002071805789=/root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789 <<< 28983 1726883104.24059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883104.24063: stdout chunk (state=3): >>><<< 28983 1726883104.24071: stderr chunk (state=3): >>><<< 28983 1726883104.24094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883104.1979764-33930-272002071805789=/root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883104.24141: variable 'ansible_module_compression' from source: unknown 28983 1726883104.24240: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883104.24250: variable 'ansible_facts' from source: unknown 28983 1726883104.24697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py 28983 1726883104.25053: Sending initial data 28983 1726883104.25057: Sent initial data (156 bytes) 28983 1726883104.26981: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883104.26986: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883104.27480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883104.27593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883104.29308: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 28983 1726883104.29323: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 28983 1726883104.29361: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883104.29469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883104.29545: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpj85xqdrm /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py <<< 28983 1726883104.29568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py" <<< 28983 1726883104.29614: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpj85xqdrm" to remote "/root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py" <<< 28983 1726883104.32638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883104.32668: stderr chunk (state=3): >>><<< 28983 1726883104.32677: stdout chunk (state=3): >>><<< 28983 1726883104.32706: done transferring module to remote 28983 1726883104.32928: _low_level_execute_command(): starting 28983 1726883104.32932: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/ /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py && sleep 0' 28983 1726883104.34049: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883104.34065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883104.34124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883104.34167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883104.34175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883104.34374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883104.36328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883104.36413: stderr chunk (state=3): >>><<< 28983 1726883104.36443: stdout chunk (state=3): >>><<< 28983 1726883104.36692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883104.36695: _low_level_execute_command(): starting 28983 1726883104.36697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/AnsiballZ_command.py && sleep 0' 28983 1726883104.38076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883104.38250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883104.38422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883104.38489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883104.38605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883104.56484: stdout chunk (state=3): >>> {"changed": true, 
"stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:45:04.559097", "end": "2024-09-20 21:45:04.562787", "delta": "0:00:00.003690", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883104.58347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883104.58354: stdout chunk (state=3): >>><<< 28983 1726883104.58357: stderr chunk (state=3): >>><<< 28983 1726883104.58364: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:45:04.559097", "end": "2024-09-20 21:45:04.562787", "delta": "0:00:00.003690", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883104.58641: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883104.58645: _low_level_execute_command(): starting 28983 1726883104.58647: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883104.1979764-33930-272002071805789/ > /dev/null 2>&1 && sleep 0' 28983 1726883104.59954: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883104.60200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883104.60212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883104.60472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883104.62494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883104.62510: stdout chunk (state=3): >>><<< 28983 1726883104.62522: stderr chunk (state=3): >>><<< 28983 1726883104.62547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883104.62940: handler run complete 28983 1726883104.62947: Evaluated conditional (False): False 28983 1726883104.62950: attempt loop complete, returning result 28983 1726883104.62952: _execute() done 28983 1726883104.62959: dumping result to json 28983 1726883104.62962: done dumping result, returning 28983 1726883104.62964: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affe814-3a2d-b16d-c0a7-000000002111] 28983 1726883104.62966: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002111 28983 1726883104.63053: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002111 28983 1726883104.63057: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003690", "end": "2024-09-20 21:45:04.562787", "rc": 0, "start": "2024-09-20 21:45:04.559097" } STDOUT: bonding_masters eth0 lo 28983 1726883104.63162: no more pending results, returning what we have 28983 1726883104.63166: results queue empty 28983 1726883104.63167: checking for any_errors_fatal 28983 1726883104.63170: done checking for any_errors_fatal 28983 1726883104.63171: checking for max_fail_percentage 28983 1726883104.63173: done checking for max_fail_percentage 28983 1726883104.63175: checking to see if all hosts have failed and the running result is not ok 28983 1726883104.63176: done checking to see if all hosts have failed 28983 1726883104.63177: getting the remaining hosts for this loop 28983 1726883104.63179: done getting the remaining hosts for this loop 28983 1726883104.63185: getting the next task for host managed_node2 28983 1726883104.63199: done getting 
next task for host managed_node2 28983 1726883104.63202: ^ task is: TASK: Set current_interfaces 28983 1726883104.63211: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883104.63217: getting variables 28983 1726883104.63219: in VariableManager get_vars() 28983 1726883104.63674: Calling all_inventory to load vars for managed_node2 28983 1726883104.63681: Calling groups_inventory to load vars for managed_node2 28983 1726883104.63686: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883104.63702: Calling all_plugins_play to load vars for managed_node2 28983 1726883104.63707: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883104.63712: Calling groups_plugins_play to load vars for managed_node2 28983 1726883104.68966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883104.74843: done with get_vars() 28983 1726883104.74889: done getting variables 28983 1726883104.74963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:45:04 -0400 (0:00:00.635) 0:02:14.747 ****** 28983 1726883104.75007: entering _queue_task() for managed_node2/set_fact 28983 1726883104.75405: worker is 1 (out of 1 available) 28983 1726883104.75420: exiting _queue_task() for managed_node2/set_fact 28983 1726883104.75637: done queuing things up, now waiting for results queue to drain 28983 1726883104.75640: waiting for pending results... 
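[Editor's note: the set_fact task queued above consumes the `_current_interfaces` result produced by the preceding command task. As a rough illustration of what that amounts to (not role code), the logged module result can be reduced to the interface-name list by splitting its stdout, which is what Ansible's stdout_lines accessor does.]

```python
import json

# Module result as it appears in the log above, trimmed to the relevant keys.
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "rc": 0}'

result = json.loads(raw)
# Equivalent of Ansible's stdout_lines: split the command output on newlines.
current_interfaces = result["stdout"].splitlines()
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo']
```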
28983 1726883104.75767: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 28983 1726883104.75997: in run() - task 0affe814-3a2d-b16d-c0a7-000000002112 28983 1726883104.76001: variable 'ansible_search_path' from source: unknown 28983 1726883104.76003: variable 'ansible_search_path' from source: unknown 28983 1726883104.76014: calling self._execute() 28983 1726883104.76138: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.76153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.76172: variable 'omit' from source: magic vars 28983 1726883104.76631: variable 'ansible_distribution_major_version' from source: facts 28983 1726883104.76760: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883104.76763: variable 'omit' from source: magic vars 28983 1726883104.76765: variable 'omit' from source: magic vars 28983 1726883104.76890: variable '_current_interfaces' from source: set_fact 28983 1726883104.76976: variable 'omit' from source: magic vars 28983 1726883104.77030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883104.77081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883104.77113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883104.77141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883104.77158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883104.77212: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883104.77222: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.77309: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.77382: Set connection var ansible_connection to ssh 28983 1726883104.77402: Set connection var ansible_shell_executable to /bin/sh 28983 1726883104.77422: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883104.77439: Set connection var ansible_timeout to 10 28983 1726883104.77453: Set connection var ansible_pipelining to False 28983 1726883104.77461: Set connection var ansible_shell_type to sh 28983 1726883104.77494: variable 'ansible_shell_executable' from source: unknown 28983 1726883104.77503: variable 'ansible_connection' from source: unknown 28983 1726883104.77511: variable 'ansible_module_compression' from source: unknown 28983 1726883104.77518: variable 'ansible_shell_type' from source: unknown 28983 1726883104.77531: variable 'ansible_shell_executable' from source: unknown 28983 1726883104.77543: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.77558: variable 'ansible_pipelining' from source: unknown 28983 1726883104.77573: variable 'ansible_timeout' from source: unknown 28983 1726883104.77637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.77761: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883104.77786: variable 'omit' from source: magic vars 28983 1726883104.77797: starting attempt loop 28983 1726883104.77805: running the handler 28983 1726883104.77822: handler run complete 28983 1726883104.77843: attempt loop complete, returning result 28983 1726883104.77857: _execute() done 28983 1726883104.77865: dumping result to json 28983 1726883104.77962: done dumping result, returning 28983 
1726883104.77966: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affe814-3a2d-b16d-c0a7-000000002112] 28983 1726883104.77968: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002112 28983 1726883104.78057: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002112 28983 1726883104.78061: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 28983 1726883104.78244: no more pending results, returning what we have 28983 1726883104.78248: results queue empty 28983 1726883104.78249: checking for any_errors_fatal 28983 1726883104.78265: done checking for any_errors_fatal 28983 1726883104.78267: checking for max_fail_percentage 28983 1726883104.78269: done checking for max_fail_percentage 28983 1726883104.78273: checking to see if all hosts have failed and the running result is not ok 28983 1726883104.78274: done checking to see if all hosts have failed 28983 1726883104.78275: getting the remaining hosts for this loop 28983 1726883104.78278: done getting the remaining hosts for this loop 28983 1726883104.78285: getting the next task for host managed_node2 28983 1726883104.78296: done getting next task for host managed_node2 28983 1726883104.78299: ^ task is: TASK: Show current_interfaces 28983 1726883104.78305: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883104.78309: getting variables 28983 1726883104.78311: in VariableManager get_vars() 28983 1726883104.78559: Calling all_inventory to load vars for managed_node2 28983 1726883104.78563: Calling groups_inventory to load vars for managed_node2 28983 1726883104.78567: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883104.78579: Calling all_plugins_play to load vars for managed_node2 28983 1726883104.78583: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883104.78587: Calling groups_plugins_play to load vars for managed_node2 28983 1726883104.80810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883104.83936: done with get_vars() 28983 1726883104.83979: done getting variables 28983 1726883104.84056: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:45:04 -0400 (0:00:00.090) 0:02:14.838 ****** 28983 1726883104.84099: entering _queue_task() for managed_node2/debug 28983 1726883104.84651: worker is 1 (out of 1 available) 28983 1726883104.84663: exiting _queue_task() for managed_node2/debug 28983 1726883104.84675: done queuing things up, now waiting for results queue to drain 28983 1726883104.84677: waiting for pending results... 
28983 1726883104.84854: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 28983 1726883104.84956: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020d7 28983 1726883104.84983: variable 'ansible_search_path' from source: unknown 28983 1726883104.84992: variable 'ansible_search_path' from source: unknown 28983 1726883104.85122: calling self._execute() 28983 1726883104.85169: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.85189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.85229: variable 'omit' from source: magic vars 28983 1726883104.85998: variable 'ansible_distribution_major_version' from source: facts 28983 1726883104.86003: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883104.86006: variable 'omit' from source: magic vars 28983 1726883104.86140: variable 'omit' from source: magic vars 28983 1726883104.86317: variable 'current_interfaces' from source: set_fact 28983 1726883104.86462: variable 'omit' from source: magic vars 28983 1726883104.86516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883104.86741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883104.86744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883104.86747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883104.86832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883104.86882: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883104.86895: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.86904: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.87047: Set connection var ansible_connection to ssh 28983 1726883104.87069: Set connection var ansible_shell_executable to /bin/sh 28983 1726883104.87093: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883104.87113: Set connection var ansible_timeout to 10 28983 1726883104.87130: Set connection var ansible_pipelining to False 28983 1726883104.87141: Set connection var ansible_shell_type to sh 28983 1726883104.87175: variable 'ansible_shell_executable' from source: unknown 28983 1726883104.87187: variable 'ansible_connection' from source: unknown 28983 1726883104.87194: variable 'ansible_module_compression' from source: unknown 28983 1726883104.87202: variable 'ansible_shell_type' from source: unknown 28983 1726883104.87210: variable 'ansible_shell_executable' from source: unknown 28983 1726883104.87218: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.87226: variable 'ansible_pipelining' from source: unknown 28983 1726883104.87291: variable 'ansible_timeout' from source: unknown 28983 1726883104.87294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.87428: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883104.87448: variable 'omit' from source: magic vars 28983 1726883104.87458: starting attempt loop 28983 1726883104.87466: running the handler 28983 1726883104.87529: handler run complete 28983 1726883104.87557: attempt loop complete, returning result 28983 1726883104.87565: _execute() done 28983 1726883104.87575: dumping result to json 28983 1726883104.87583: done dumping result, returning 28983 1726883104.87595: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affe814-3a2d-b16d-c0a7-0000000020d7] 28983 1726883104.87615: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020d7 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 28983 1726883104.87776: no more pending results, returning what we have 28983 1726883104.87781: results queue empty 28983 1726883104.87782: checking for any_errors_fatal 28983 1726883104.87794: done checking for any_errors_fatal 28983 1726883104.87795: checking for max_fail_percentage 28983 1726883104.87797: done checking for max_fail_percentage 28983 1726883104.87798: checking to see if all hosts have failed and the running result is not ok 28983 1726883104.87799: done checking to see if all hosts have failed 28983 1726883104.87800: getting the remaining hosts for this loop 28983 1726883104.87802: done getting the remaining hosts for this loop 28983 1726883104.87808: getting the next task for host managed_node2 28983 1726883104.87821: done getting next task for host managed_node2 28983 1726883104.87825: ^ task is: TASK: Setup 28983 1726883104.87829: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883104.87836: getting variables 28983 1726883104.87838: in VariableManager get_vars() 28983 1726883104.87891: Calling all_inventory to load vars for managed_node2 28983 1726883104.87894: Calling groups_inventory to load vars for managed_node2 28983 1726883104.87899: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883104.87910: Calling all_plugins_play to load vars for managed_node2 28983 1726883104.87915: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883104.87919: Calling groups_plugins_play to load vars for managed_node2 28983 1726883104.88778: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020d7 28983 1726883104.88781: WORKER PROCESS EXITING 28983 1726883104.90737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883104.94013: done with get_vars() 28983 1726883104.94051: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:45:04 -0400 (0:00:00.100) 0:02:14.939 ****** 28983 1726883104.94159: entering _queue_task() for managed_node2/include_tasks 28983 1726883104.94480: worker is 1 (out of 1 available) 28983 1726883104.94493: exiting _queue_task() for managed_node2/include_tasks 28983 1726883104.94505: done queuing things up, now waiting for results queue to drain 28983 1726883104.94507: waiting for pending results... 
28983 1726883104.94825: running TaskExecutor() for managed_node2/TASK: Setup 28983 1726883104.94968: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020b0 28983 1726883104.94994: variable 'ansible_search_path' from source: unknown 28983 1726883104.95007: variable 'ansible_search_path' from source: unknown 28983 1726883104.95072: variable 'lsr_setup' from source: include params 28983 1726883104.95321: variable 'lsr_setup' from source: include params 28983 1726883104.95412: variable 'omit' from source: magic vars 28983 1726883104.95588: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.95612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.95631: variable 'omit' from source: magic vars 28983 1726883104.95943: variable 'ansible_distribution_major_version' from source: facts 28983 1726883104.95959: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883104.95974: variable 'item' from source: unknown 28983 1726883104.96065: variable 'item' from source: unknown 28983 1726883104.96113: variable 'item' from source: unknown 28983 1726883104.96356: variable 'item' from source: unknown 28983 1726883104.96476: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883104.96480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.96483: variable 'omit' from source: magic vars 28983 1726883104.97243: variable 'ansible_distribution_major_version' from source: facts 28983 1726883104.97249: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883104.97252: variable 'item' from source: unknown 28983 1726883104.97255: variable 'item' from source: unknown 28983 1726883104.97284: variable 'item' from source: unknown 28983 1726883104.97543: variable 'item' from source: unknown 28983 1726883104.98043: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883104.98046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883104.98049: variable 'omit' from source: magic vars 28983 1726883104.98560: variable 'ansible_distribution_major_version' from source: facts 28983 1726883104.98564: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883104.98566: variable 'item' from source: unknown 28983 1726883104.98840: variable 'item' from source: unknown 28983 1726883104.98843: variable 'item' from source: unknown 28983 1726883104.99033: variable 'item' from source: unknown 28983 1726883104.99097: dumping result to json 28983 1726883104.99111: done dumping result, returning 28983 1726883104.99439: done running TaskExecutor() for managed_node2/TASK: Setup [0affe814-3a2d-b16d-c0a7-0000000020b0] 28983 1726883104.99445: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b0 28983 1726883104.99492: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b0 28983 1726883104.99496: WORKER PROCESS EXITING 28983 1726883104.99591: no more pending results, returning what we have 28983 1726883104.99597: in VariableManager get_vars() 28983 1726883104.99658: Calling all_inventory to load vars for managed_node2 28983 1726883104.99662: Calling groups_inventory to load vars for managed_node2 28983 1726883104.99666: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883104.99685: Calling all_plugins_play to load vars for managed_node2 28983 1726883104.99690: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883104.99694: Calling groups_plugins_play to load vars for managed_node2 28983 1726883105.03275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883105.08050: done with get_vars() 28983 1726883105.08092: variable 'ansible_search_path' from source: unknown 28983 1726883105.08093: variable 'ansible_search_path' from source: unknown 28983 
1726883105.08149: variable 'ansible_search_path' from source: unknown 28983 1726883105.08151: variable 'ansible_search_path' from source: unknown 28983 1726883105.08192: variable 'ansible_search_path' from source: unknown 28983 1726883105.08193: variable 'ansible_search_path' from source: unknown 28983 1726883105.08243: we have included files to process 28983 1726883105.08244: generating all_blocks data 28983 1726883105.08247: done generating all_blocks data 28983 1726883105.08253: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883105.08254: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883105.08257: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 28983 1726883105.08595: done processing included file 28983 1726883105.08597: iterating over new_blocks loaded from include file 28983 1726883105.08599: in VariableManager get_vars() 28983 1726883105.08620: done with get_vars() 28983 1726883105.08622: filtering new block on tags 28983 1726883105.08685: done filtering new block on tags 28983 1726883105.08688: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 28983 1726883105.08694: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883105.08696: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883105.08699: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 28983 1726883105.08819: done processing included file 28983 1726883105.08821: iterating over new_blocks loaded from include file 28983 1726883105.08823: in VariableManager get_vars() 28983 1726883105.08844: done with get_vars() 28983 1726883105.08846: filtering new block on tags 28983 1726883105.08889: done filtering new block on tags 28983 1726883105.08893: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 28983 1726883105.08897: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883105.08899: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883105.08902: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883105.09040: done processing included file 28983 1726883105.09043: iterating over new_blocks loaded from include file 28983 1726883105.09044: in VariableManager get_vars() 28983 1726883105.09065: done with get_vars() 28983 1726883105.09068: filtering new block on tags 28983 1726883105.09110: done filtering new block on tags 28983 1726883105.09113: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 28983 1726883105.09117: extending task lists for all hosts with included blocks 28983 1726883105.10317: done extending task lists 28983 1726883105.10319: done processing 
included files 28983 1726883105.10320: results queue empty 28983 1726883105.10321: checking for any_errors_fatal 28983 1726883105.10325: done checking for any_errors_fatal 28983 1726883105.10326: checking for max_fail_percentage 28983 1726883105.10328: done checking for max_fail_percentage 28983 1726883105.10329: checking to see if all hosts have failed and the running result is not ok 28983 1726883105.10330: done checking to see if all hosts have failed 28983 1726883105.10331: getting the remaining hosts for this loop 28983 1726883105.10332: done getting the remaining hosts for this loop 28983 1726883105.10337: getting the next task for host managed_node2 28983 1726883105.10342: done getting next task for host managed_node2 28983 1726883105.10345: ^ task is: TASK: Include network role 28983 1726883105.10349: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883105.10352: getting variables 28983 1726883105.10353: in VariableManager get_vars() 28983 1726883105.10366: Calling all_inventory to load vars for managed_node2 28983 1726883105.10369: Calling groups_inventory to load vars for managed_node2 28983 1726883105.10375: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883105.10381: Calling all_plugins_play to load vars for managed_node2 28983 1726883105.10389: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883105.10394: Calling groups_plugins_play to load vars for managed_node2 28983 1726883105.12674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883105.15900: done with get_vars() 28983 1726883105.15936: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:45:05 -0400 (0:00:00.218) 0:02:15.158 ****** 28983 1726883105.16036: entering _queue_task() for managed_node2/include_role 28983 1726883105.16574: worker is 1 (out of 1 available) 28983 1726883105.16588: exiting _queue_task() for managed_node2/include_role 28983 1726883105.16602: done queuing things up, now waiting for results queue to drain 28983 1726883105.16604: waiting for pending results... 
28983 1726883105.16926: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883105.17141: in run() - task 0affe814-3a2d-b16d-c0a7-000000002139 28983 1726883105.17170: variable 'ansible_search_path' from source: unknown 28983 1726883105.17188: variable 'ansible_search_path' from source: unknown 28983 1726883105.17243: calling self._execute() 28983 1726883105.17393: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883105.17418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883105.17443: variable 'omit' from source: magic vars 28983 1726883105.18053: variable 'ansible_distribution_major_version' from source: facts 28983 1726883105.18074: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883105.18090: _execute() done 28983 1726883105.18175: dumping result to json 28983 1726883105.18180: done dumping result, returning 28983 1726883105.18184: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000002139] 28983 1726883105.18189: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002139 28983 1726883105.18366: no more pending results, returning what we have 28983 1726883105.18375: in VariableManager get_vars() 28983 1726883105.18448: Calling all_inventory to load vars for managed_node2 28983 1726883105.18453: Calling groups_inventory to load vars for managed_node2 28983 1726883105.18458: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883105.18477: Calling all_plugins_play to load vars for managed_node2 28983 1726883105.18483: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883105.18488: Calling groups_plugins_play to load vars for managed_node2 28983 1726883105.19342: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002139 28983 1726883105.19346: WORKER PROCESS EXITING 28983 1726883105.23204: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883105.26817: done with get_vars() 28983 1726883105.26858: variable 'ansible_search_path' from source: unknown 28983 1726883105.26859: variable 'ansible_search_path' from source: unknown 28983 1726883105.27480: variable 'omit' from source: magic vars 28983 1726883105.27639: variable 'omit' from source: magic vars 28983 1726883105.27660: variable 'omit' from source: magic vars 28983 1726883105.27782: we have included files to process 28983 1726883105.27784: generating all_blocks data 28983 1726883105.27786: done generating all_blocks data 28983 1726883105.27788: processing included file: fedora.linux_system_roles.network 28983 1726883105.27818: in VariableManager get_vars() 28983 1726883105.27861: done with get_vars() 28983 1726883105.28014: in VariableManager get_vars() 28983 1726883105.28062: done with get_vars() 28983 1726883105.28116: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883105.28426: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883105.28775: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883105.30397: in VariableManager get_vars() 28983 1726883105.30425: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883105.33942: iterating over new_blocks loaded from include file 28983 1726883105.33945: in VariableManager get_vars() 28983 1726883105.33969: done with get_vars() 28983 1726883105.33970: filtering new block on tags 28983 1726883105.34407: done filtering new block on tags 28983 1726883105.34411: in VariableManager get_vars() 28983 1726883105.34458: done with get_vars() 28983 1726883105.34460: filtering new block on tags 28983 1726883105.34482: done 
filtering new block on tags 28983 1726883105.34485: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883105.34491: extending task lists for all hosts with included blocks 28983 1726883105.35057: done extending task lists 28983 1726883105.35059: done processing included files 28983 1726883105.35060: results queue empty 28983 1726883105.35061: checking for any_errors_fatal 28983 1726883105.35065: done checking for any_errors_fatal 28983 1726883105.35066: checking for max_fail_percentage 28983 1726883105.35068: done checking for max_fail_percentage 28983 1726883105.35069: checking to see if all hosts have failed and the running result is not ok 28983 1726883105.35070: done checking to see if all hosts have failed 28983 1726883105.35071: getting the remaining hosts for this loop 28983 1726883105.35072: done getting the remaining hosts for this loop 28983 1726883105.35075: getting the next task for host managed_node2 28983 1726883105.35081: done getting next task for host managed_node2 28983 1726883105.35084: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883105.35093: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883105.35106: getting variables 28983 1726883105.35107: in VariableManager get_vars() 28983 1726883105.35125: Calling all_inventory to load vars for managed_node2 28983 1726883105.35128: Calling groups_inventory to load vars for managed_node2 28983 1726883105.35131: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883105.35205: Calling all_plugins_play to load vars for managed_node2 28983 1726883105.35209: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883105.35213: Calling groups_plugins_play to load vars for managed_node2 28983 1726883105.37678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883105.40832: done with get_vars() 28983 1726883105.40876: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:45:05 -0400 (0:00:00.249) 0:02:15.407 ****** 28983 1726883105.40968: entering _queue_task() for managed_node2/include_tasks 28983 1726883105.41377: worker is 1 (out of 1 available) 28983 1726883105.41390: exiting _queue_task() for managed_node2/include_tasks 28983 1726883105.41403: done queuing things up, now waiting for results queue to drain 28983 1726883105.41404: waiting for pending results... 
28983 1726883105.42132: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883105.42626: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a3 28983 1726883105.42635: variable 'ansible_search_path' from source: unknown 28983 1726883105.42638: variable 'ansible_search_path' from source: unknown 28983 1726883105.42642: calling self._execute() 28983 1726883105.42943: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883105.43143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883105.43146: variable 'omit' from source: magic vars 28983 1726883105.43766: variable 'ansible_distribution_major_version' from source: facts 28983 1726883105.43779: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883105.43785: _execute() done 28983 1726883105.43791: dumping result to json 28983 1726883105.43794: done dumping result, returning 28983 1726883105.43804: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-0000000021a3] 28983 1726883105.43885: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a3 28983 1726883105.43958: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a3 28983 1726883105.43961: WORKER PROCESS EXITING 28983 1726883105.44017: no more pending results, returning what we have 28983 1726883105.44025: in VariableManager get_vars() 28983 1726883105.44094: Calling all_inventory to load vars for managed_node2 28983 1726883105.44099: Calling groups_inventory to load vars for managed_node2 28983 1726883105.44102: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883105.44115: Calling all_plugins_play to load vars for managed_node2 28983 1726883105.44120: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883105.44124: Calling 
groups_plugins_play to load vars for managed_node2 28983 1726883105.46778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883105.50762: done with get_vars() 28983 1726883105.50800: variable 'ansible_search_path' from source: unknown 28983 1726883105.50802: variable 'ansible_search_path' from source: unknown 28983 1726883105.50929: we have included files to process 28983 1726883105.50930: generating all_blocks data 28983 1726883105.50932: done generating all_blocks data 28983 1726883105.50940: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883105.50942: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883105.50945: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883105.51813: done processing included file 28983 1726883105.51815: iterating over new_blocks loaded from include file 28983 1726883105.51817: in VariableManager get_vars() 28983 1726883105.51860: done with get_vars() 28983 1726883105.51863: filtering new block on tags 28983 1726883105.51903: done filtering new block on tags 28983 1726883105.51907: in VariableManager get_vars() 28983 1726883105.51942: done with get_vars() 28983 1726883105.51944: filtering new block on tags 28983 1726883105.52013: done filtering new block on tags 28983 1726883105.52017: in VariableManager get_vars() 28983 1726883105.52051: done with get_vars() 28983 1726883105.52053: filtering new block on tags 28983 1726883105.52123: done filtering new block on tags 28983 1726883105.52127: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883105.52133: extending task lists for 
all hosts with included blocks 28983 1726883105.54761: done extending task lists 28983 1726883105.54763: done processing included files 28983 1726883105.54764: results queue empty 28983 1726883105.54765: checking for any_errors_fatal 28983 1726883105.54769: done checking for any_errors_fatal 28983 1726883105.54769: checking for max_fail_percentage 28983 1726883105.54771: done checking for max_fail_percentage 28983 1726883105.54772: checking to see if all hosts have failed and the running result is not ok 28983 1726883105.54773: done checking to see if all hosts have failed 28983 1726883105.54774: getting the remaining hosts for this loop 28983 1726883105.54776: done getting the remaining hosts for this loop 28983 1726883105.54779: getting the next task for host managed_node2 28983 1726883105.54786: done getting next task for host managed_node2 28983 1726883105.54796: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883105.54802: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883105.54815: getting variables 28983 1726883105.54817: in VariableManager get_vars() 28983 1726883105.54837: Calling all_inventory to load vars for managed_node2 28983 1726883105.54840: Calling groups_inventory to load vars for managed_node2 28983 1726883105.54843: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883105.54850: Calling all_plugins_play to load vars for managed_node2 28983 1726883105.54853: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883105.54857: Calling groups_plugins_play to load vars for managed_node2 28983 1726883105.58939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883105.65371: done with get_vars() 28983 1726883105.65415: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:45:05 -0400 (0:00:00.246) 0:02:15.654 ****** 28983 1726883105.65742: entering _queue_task() for managed_node2/setup 28983 1726883105.66497: worker is 1 (out of 1 available) 28983 1726883105.66512: exiting _queue_task() for managed_node2/setup 28983 1726883105.66528: done queuing things up, now waiting for results queue to drain 28983 1726883105.66530: waiting for pending results... 
28983 1726883105.67305: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883105.67699: in run() - task 0affe814-3a2d-b16d-c0a7-000000002200 28983 1726883105.67704: variable 'ansible_search_path' from source: unknown 28983 1726883105.67707: variable 'ansible_search_path' from source: unknown 28983 1726883105.67880: calling self._execute() 28983 1726883105.68157: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883105.68162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883105.68175: variable 'omit' from source: magic vars 28983 1726883105.69265: variable 'ansible_distribution_major_version' from source: facts 28983 1726883105.69386: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883105.69788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883105.85324: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883105.85419: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883105.85479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883105.85522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883105.85576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883105.85683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883105.85723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883105.85762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883105.85823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883105.85855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883105.85945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883105.85978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883105.86140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883105.86158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883105.86181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883105.86567: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883105.86583: variable 'ansible_facts' from source: unknown 28983 1726883105.87873: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883105.87884: when evaluation is False, skipping this task 28983 1726883105.87887: _execute() done 28983 1726883105.87894: dumping result to json 28983 1726883105.87897: done dumping result, returning 28983 1726883105.87905: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-000000002200] 28983 1726883105.87908: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002200 28983 1726883105.88222: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002200 28983 1726883105.88226: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883105.88284: no more pending results, returning what we have 28983 1726883105.88288: results queue empty 28983 1726883105.88289: checking for any_errors_fatal 28983 1726883105.88291: done checking for any_errors_fatal 28983 1726883105.88292: checking for max_fail_percentage 28983 1726883105.88294: done checking for max_fail_percentage 28983 1726883105.88295: checking to see if all hosts have failed and the running result is not ok 28983 1726883105.88296: done checking to see if all hosts have failed 28983 1726883105.88297: getting the remaining hosts for this loop 28983 1726883105.88299: done getting the remaining hosts for this loop 28983 1726883105.88304: getting the next task for host managed_node2 28983 1726883105.88319: done getting next task for host managed_node2 28983 1726883105.88323: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883105.88331: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883105.88356: getting variables 28983 1726883105.88358: in VariableManager get_vars() 28983 1726883105.88415: Calling all_inventory to load vars for managed_node2 28983 1726883105.88419: Calling groups_inventory to load vars for managed_node2 28983 1726883105.88422: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883105.88433: Calling all_plugins_play to load vars for managed_node2 28983 1726883105.88750: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883105.88769: Calling groups_plugins_play to load vars for managed_node2 28983 1726883105.97843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883106.00094: done with get_vars() 28983 1726883106.00126: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:45:06 -0400 (0:00:00.345) 0:02:16.000 ****** 28983 1726883106.00223: entering _queue_task() for managed_node2/stat 28983 1726883106.00558: worker is 1 (out of 1 available) 28983 1726883106.00573: exiting _queue_task() for managed_node2/stat 28983 1726883106.00588: done queuing things up, now waiting for results queue to drain 28983 1726883106.00590: waiting for pending results... 
28983 1726883106.00821: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883106.00974: in run() - task 0affe814-3a2d-b16d-c0a7-000000002202 28983 1726883106.00991: variable 'ansible_search_path' from source: unknown 28983 1726883106.00998: variable 'ansible_search_path' from source: unknown 28983 1726883106.01086: calling self._execute() 28983 1726883106.01142: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883106.01151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883106.01165: variable 'omit' from source: magic vars 28983 1726883106.01563: variable 'ansible_distribution_major_version' from source: facts 28983 1726883106.01577: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883106.01784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883106.02040: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883106.02108: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883106.02203: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883106.02245: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883106.02361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883106.02393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883106.02419: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883106.02447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883106.02535: variable '__network_is_ostree' from source: set_fact 28983 1726883106.02542: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883106.02547: when evaluation is False, skipping this task 28983 1726883106.02550: _execute() done 28983 1726883106.02556: dumping result to json 28983 1726883106.02560: done dumping result, returning 28983 1726883106.02568: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000002202] 28983 1726883106.02575: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002202 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883106.02737: no more pending results, returning what we have 28983 1726883106.02741: results queue empty 28983 1726883106.02742: checking for any_errors_fatal 28983 1726883106.02755: done checking for any_errors_fatal 28983 1726883106.02756: checking for max_fail_percentage 28983 1726883106.02758: done checking for max_fail_percentage 28983 1726883106.02760: checking to see if all hosts have failed and the running result is not ok 28983 1726883106.02761: done checking to see if all hosts have failed 28983 1726883106.02762: getting the remaining hosts for this loop 28983 1726883106.02764: done getting the remaining hosts for this loop 28983 1726883106.02770: getting the next task for host managed_node2 28983 1726883106.02780: done getting next task for host managed_node2 28983 
1726883106.02784: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883106.02791: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883106.02814: getting variables 28983 1726883106.02816: in VariableManager get_vars() 28983 1726883106.02875: Calling all_inventory to load vars for managed_node2 28983 1726883106.02879: Calling groups_inventory to load vars for managed_node2 28983 1726883106.02882: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883106.02889: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002202 28983 1726883106.02893: WORKER PROCESS EXITING 28983 1726883106.02901: Calling all_plugins_play to load vars for managed_node2 28983 1726883106.02904: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883106.02907: Calling groups_plugins_play to load vars for managed_node2 28983 1726883106.04316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883106.05977: done with get_vars() 28983 1726883106.06005: done getting variables 28983 1726883106.06055: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:45:06 -0400 (0:00:00.058) 0:02:16.058 ****** 28983 1726883106.06090: entering _queue_task() for managed_node2/set_fact 28983 1726883106.06356: worker is 1 (out of 1 available) 28983 1726883106.06373: exiting _queue_task() for managed_node2/set_fact 28983 1726883106.06386: done queuing things up, now waiting for results queue to drain 28983 1726883106.06388: waiting for pending results... 
28983 1726883106.06592: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883106.06941: in run() - task 0affe814-3a2d-b16d-c0a7-000000002203 28983 1726883106.06946: variable 'ansible_search_path' from source: unknown 28983 1726883106.06950: variable 'ansible_search_path' from source: unknown 28983 1726883106.06953: calling self._execute() 28983 1726883106.07051: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883106.07084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883106.07109: variable 'omit' from source: magic vars 28983 1726883106.07751: variable 'ansible_distribution_major_version' from source: facts 28983 1726883106.07776: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883106.08237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883106.08482: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883106.08557: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883106.08674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883106.08740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883106.08884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883106.08944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883106.08989: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883106.09053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883106.09215: variable '__network_is_ostree' from source: set_fact 28983 1726883106.09249: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883106.09264: when evaluation is False, skipping this task 28983 1726883106.09345: _execute() done 28983 1726883106.09349: dumping result to json 28983 1726883106.09358: done dumping result, returning 28983 1726883106.09362: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000002203] 28983 1726883106.09365: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002203 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883106.09608: no more pending results, returning what we have 28983 1726883106.09615: results queue empty 28983 1726883106.09616: checking for any_errors_fatal 28983 1726883106.09625: done checking for any_errors_fatal 28983 1726883106.09626: checking for max_fail_percentage 28983 1726883106.09628: done checking for max_fail_percentage 28983 1726883106.09629: checking to see if all hosts have failed and the running result is not ok 28983 1726883106.09630: done checking to see if all hosts have failed 28983 1726883106.09631: getting the remaining hosts for this loop 28983 1726883106.09638: done getting the remaining hosts for this loop 28983 1726883106.09839: getting the next task for host managed_node2 28983 1726883106.09857: done getting next task for host managed_node2 28983 
1726883106.09862: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883106.09869: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883106.09896: getting variables 28983 1726883106.09898: in VariableManager get_vars() 28983 1726883106.09952: Calling all_inventory to load vars for managed_node2 28983 1726883106.09956: Calling groups_inventory to load vars for managed_node2 28983 1726883106.09959: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883106.09970: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002203 28983 1726883106.09977: WORKER PROCESS EXITING 28983 1726883106.09993: Calling all_plugins_play to load vars for managed_node2 28983 1726883106.09998: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883106.10007: Calling groups_plugins_play to load vars for managed_node2 28983 1726883106.12880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883106.15142: done with get_vars() 28983 1726883106.15165: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:45:06 -0400 (0:00:00.091) 0:02:16.150 ****** 28983 1726883106.15249: entering _queue_task() for managed_node2/service_facts 28983 1726883106.15488: worker is 1 (out of 1 available) 28983 1726883106.15502: exiting _queue_task() for managed_node2/service_facts 28983 1726883106.15514: done queuing things up, now waiting for results queue to drain 28983 1726883106.15516: waiting for pending results... 
28983 1726883106.15754: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883106.15932: in run() - task 0affe814-3a2d-b16d-c0a7-000000002205 28983 1726883106.15947: variable 'ansible_search_path' from source: unknown 28983 1726883106.15951: variable 'ansible_search_path' from source: unknown 28983 1726883106.15987: calling self._execute() 28983 1726883106.16093: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883106.16099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883106.16112: variable 'omit' from source: magic vars 28983 1726883106.16506: variable 'ansible_distribution_major_version' from source: facts 28983 1726883106.16516: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883106.16528: variable 'omit' from source: magic vars 28983 1726883106.16596: variable 'omit' from source: magic vars 28983 1726883106.16643: variable 'omit' from source: magic vars 28983 1726883106.16680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883106.16712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883106.16730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883106.16752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883106.16761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883106.16794: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883106.16798: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883106.16801: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883106.16926: Set connection var ansible_connection to ssh 28983 1726883106.16938: Set connection var ansible_shell_executable to /bin/sh 28983 1726883106.16948: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883106.16956: Set connection var ansible_timeout to 10 28983 1726883106.16963: Set connection var ansible_pipelining to False 28983 1726883106.16966: Set connection var ansible_shell_type to sh 28983 1726883106.16991: variable 'ansible_shell_executable' from source: unknown 28983 1726883106.16995: variable 'ansible_connection' from source: unknown 28983 1726883106.16998: variable 'ansible_module_compression' from source: unknown 28983 1726883106.17001: variable 'ansible_shell_type' from source: unknown 28983 1726883106.17006: variable 'ansible_shell_executable' from source: unknown 28983 1726883106.17008: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883106.17012: variable 'ansible_pipelining' from source: unknown 28983 1726883106.17014: variable 'ansible_timeout' from source: unknown 28983 1726883106.17017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883106.17186: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883106.17195: variable 'omit' from source: magic vars 28983 1726883106.17201: starting attempt loop 28983 1726883106.17206: running the handler 28983 1726883106.17219: _low_level_execute_command(): starting 28983 1726883106.17227: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883106.17772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883106.17776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.17780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883106.17782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.17825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883106.17830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883106.17851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883106.17937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883106.19713: stdout chunk (state=3): >>>/root <<< 28983 1726883106.19850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883106.19893: stderr chunk (state=3): >>><<< 28983 1726883106.19898: stdout chunk (state=3): >>><<< 28983 1726883106.19921: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883106.19937: _low_level_execute_command(): starting 28983 1726883106.19944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218 `" && echo ansible-tmp-1726883106.1992152-34003-166331571650218="` echo /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218 `" ) && sleep 0' 28983 1726883106.20405: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883106.20408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883106.20411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.20419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883106.20422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.20476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883106.20481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883106.20556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883106.22602: stdout chunk (state=3): >>>ansible-tmp-1726883106.1992152-34003-166331571650218=/root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218 <<< 28983 1726883106.22724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883106.22772: stderr chunk (state=3): >>><<< 28983 1726883106.22776: stdout chunk (state=3): >>><<< 28983 1726883106.22804: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883106.1992152-34003-166331571650218=/root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883106.22837: variable 'ansible_module_compression' from source: unknown 28983 1726883106.22881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883106.22924: variable 'ansible_facts' from source: unknown 28983 1726883106.22992: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py 28983 1726883106.23149: Sending initial data 28983 1726883106.23153: Sent initial data (162 bytes) 28983 1726883106.23696: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883106.23699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883106.23703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883106.23707: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883106.23710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.23796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883106.23860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883106.25539: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883106.25608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883106.25692: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp6mklng49 /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py <<< 28983 1726883106.25695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py" <<< 28983 1726883106.25771: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp6mklng49" to remote "/root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py" <<< 28983 1726883106.26902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883106.27022: stderr chunk (state=3): >>><<< 28983 1726883106.27025: stdout chunk (state=3): >>><<< 28983 1726883106.27027: done transferring module to remote 28983 1726883106.27029: _low_level_execute_command(): starting 28983 1726883106.27032: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/ /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py && sleep 0' 28983 1726883106.27526: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883106.27529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.27532: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883106.27536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.27592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883106.27595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883106.27662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883106.29839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883106.29843: stdout chunk (state=3): >>><<< 28983 1726883106.29845: stderr chunk (state=3): >>><<< 28983 1726883106.29848: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883106.29851: _low_level_execute_command(): starting 28983 1726883106.29853: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/AnsiballZ_service_facts.py && sleep 0' 28983 1726883106.30450: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883106.30495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883106.30516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883106.30546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883106.30720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 28983 1726883108.27869: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service<<< 28983 1726883108.27886: stdout chunk (state=3): >>>", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-login<<< 28983 1726883108.27922: stdout chunk (state=3): >>>d.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "stat<<< 28983 1726883108.27944: stdout chunk (state=3): >>>ic", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "stati<<< 28983 1726883108.27953: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883108.29734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883108.29766: stderr chunk (state=3): >>><<< 28983 1726883108.29769: stdout chunk (state=3): >>><<< 28983 1726883108.29801: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883108.30517: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883108.30528: _low_level_execute_command(): starting 28983 1726883108.30535: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883106.1992152-34003-166331571650218/ > /dev/null 2>&1 && sleep 0' 28983 1726883108.31001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883108.31004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883108.31007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
28983 1726883108.31009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883108.31012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.31072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883108.31076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883108.31146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883108.33118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883108.33161: stderr chunk (state=3): >>><<< 28983 1726883108.33165: stdout chunk (state=3): >>><<< 28983 1726883108.33181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883108.33188: handler run complete 28983 1726883108.33364: variable 'ansible_facts' from source: unknown 28983 1726883108.33516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883108.33973: variable 'ansible_facts' from source: unknown 28983 1726883108.34105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883108.34303: attempt loop complete, returning result 28983 1726883108.34310: _execute() done 28983 1726883108.34317: dumping result to json 28983 1726883108.34364: done dumping result, returning 28983 1726883108.34375: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000002205] 28983 1726883108.34381: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002205 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883108.35243: no more pending results, returning what we have 28983 1726883108.35246: results queue empty 28983 1726883108.35246: checking for any_errors_fatal 28983 1726883108.35251: done checking for any_errors_fatal 28983 1726883108.35252: checking for max_fail_percentage 28983 1726883108.35259: done checking for max_fail_percentage 28983 1726883108.35260: checking to see if all hosts have failed and the running result is not ok 28983 1726883108.35261: done checking to see if all hosts have failed 28983 
1726883108.35262: getting the remaining hosts for this loop 28983 1726883108.35263: done getting the remaining hosts for this loop 28983 1726883108.35266: getting the next task for host managed_node2 28983 1726883108.35275: done getting next task for host managed_node2 28983 1726883108.35278: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883108.35284: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883108.35294: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002205 28983 1726883108.35297: WORKER PROCESS EXITING 28983 1726883108.35304: getting variables 28983 1726883108.35306: in VariableManager get_vars() 28983 1726883108.35338: Calling all_inventory to load vars for managed_node2 28983 1726883108.35341: Calling groups_inventory to load vars for managed_node2 28983 1726883108.35342: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883108.35349: Calling all_plugins_play to load vars for managed_node2 28983 1726883108.35352: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883108.35354: Calling groups_plugins_play to load vars for managed_node2 28983 1726883108.36707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883108.38352: done with get_vars() 28983 1726883108.38378: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:45:08 -0400 (0:00:02.232) 0:02:18.382 ****** 28983 1726883108.38466: entering _queue_task() for managed_node2/package_facts 28983 1726883108.38713: worker is 1 (out of 1 available) 28983 1726883108.38728: exiting _queue_task() for managed_node2/package_facts 28983 1726883108.38742: done queuing things up, now waiting for results queue to drain 28983 1726883108.38744: waiting for pending results... 
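The `service_facts` result dumped earlier in this log is a mapping of unit name to a record with `name`, `state`, `status`, and `source` keys. A playbook typically consumes it via conditions like `ansible_facts.services['foo.service'].state == 'running'`; the sketch below mirrors that filtering in plain Python against a small hypothetical excerpt (the three sample entries are copied from the log above, not a full inventory of the host):

```python
# Sketch: filter a service_facts-style mapping for units reported as running.
# The dict below is a hand-picked excerpt of the structure dumped in the log;
# on a real host this data comes from ansible_facts.services.
services = {
    "systemd-udevd.service": {
        "name": "systemd-udevd.service", "state": "running",
        "status": "static", "source": "systemd",
    },
    "firewalld.service": {
        "name": "firewalld.service", "state": "inactive",
        "status": "disabled", "source": "systemd",
    },
    "wpa_supplicant.service": {
        "name": "wpa_supplicant.service", "state": "running",
        "status": "enabled", "source": "systemd",
    },
}

# Keep only units systemd reports with state == "running", sorted by name.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # → ['systemd-udevd.service', 'wpa_supplicant.service']
```

This is the same shape of check the role performs in its "Check which services are running" task before deciding, for example, whether a network service needs to be managed.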
28983 1726883108.38943: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883108.39079: in run() - task 0affe814-3a2d-b16d-c0a7-000000002206 28983 1726883108.39095: variable 'ansible_search_path' from source: unknown 28983 1726883108.39099: variable 'ansible_search_path' from source: unknown 28983 1726883108.39132: calling self._execute() 28983 1726883108.39220: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883108.39228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883108.39240: variable 'omit' from source: magic vars 28983 1726883108.39577: variable 'ansible_distribution_major_version' from source: facts 28983 1726883108.39587: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883108.39594: variable 'omit' from source: magic vars 28983 1726883108.39668: variable 'omit' from source: magic vars 28983 1726883108.39697: variable 'omit' from source: magic vars 28983 1726883108.39732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883108.39769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883108.39788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883108.39804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883108.39814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883108.39847: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883108.39850: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883108.39853: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883108.39938: Set connection var ansible_connection to ssh 28983 1726883108.39948: Set connection var ansible_shell_executable to /bin/sh 28983 1726883108.39957: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883108.39968: Set connection var ansible_timeout to 10 28983 1726883108.39975: Set connection var ansible_pipelining to False 28983 1726883108.39978: Set connection var ansible_shell_type to sh 28983 1726883108.40000: variable 'ansible_shell_executable' from source: unknown 28983 1726883108.40003: variable 'ansible_connection' from source: unknown 28983 1726883108.40006: variable 'ansible_module_compression' from source: unknown 28983 1726883108.40009: variable 'ansible_shell_type' from source: unknown 28983 1726883108.40012: variable 'ansible_shell_executable' from source: unknown 28983 1726883108.40017: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883108.40022: variable 'ansible_pipelining' from source: unknown 28983 1726883108.40025: variable 'ansible_timeout' from source: unknown 28983 1726883108.40030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883108.40199: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883108.40212: variable 'omit' from source: magic vars 28983 1726883108.40215: starting attempt loop 28983 1726883108.40218: running the handler 28983 1726883108.40231: _low_level_execute_command(): starting 28983 1726883108.40240: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883108.40782: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883108.40787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883108.40790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.40840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883108.40844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883108.40925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883108.42702: stdout chunk (state=3): >>>/root <<< 28983 1726883108.42817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883108.42867: stderr chunk (state=3): >>><<< 28983 1726883108.42870: stdout chunk (state=3): >>><<< 28983 1726883108.42896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883108.42907: _low_level_execute_command(): starting 28983 1726883108.42913: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700 `" && echo ansible-tmp-1726883108.4289513-34067-106992734094700="` echo /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700 `" ) && sleep 0' 28983 1726883108.43329: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883108.43377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883108.43380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.43384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883108.43393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.43429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883108.43437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883108.43508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883108.45549: stdout chunk (state=3): >>>ansible-tmp-1726883108.4289513-34067-106992734094700=/root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700 <<< 28983 1726883108.45670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883108.45713: stderr chunk (state=3): >>><<< 28983 1726883108.45717: stdout chunk (state=3): >>><<< 28983 1726883108.45734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883108.4289513-34067-106992734094700=/root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883108.45772: variable 'ansible_module_compression' from source: unknown 28983 1726883108.45815: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883108.45869: variable 'ansible_facts' from source: unknown 28983 1726883108.46010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py 28983 1726883108.46129: Sending initial data 28983 1726883108.46133: Sent initial data (162 bytes) 28983 1726883108.46584: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883108.46589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.46592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883108.46594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.46657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883108.46659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883108.46728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883108.48404: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883108.48414: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883108.48474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883108.48546: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp14v1qfwt /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py <<< 28983 1726883108.48549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py" <<< 28983 1726883108.48608: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp14v1qfwt" to remote "/root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py" <<< 28983 1726883108.50467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883108.50526: stderr chunk (state=3): >>><<< 28983 1726883108.50530: stdout chunk (state=3): >>><<< 28983 1726883108.50551: done transferring module to remote 28983 1726883108.50559: _low_level_execute_command(): starting 28983 1726883108.50564: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/ /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py && sleep 0' 28983 1726883108.50999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883108.51003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.51005: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883108.51008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.51066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883108.51069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883108.51143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883108.53092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883108.53134: stderr chunk (state=3): >>><<< 28983 1726883108.53142: stdout chunk (state=3): >>><<< 28983 1726883108.53151: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883108.53155: _low_level_execute_command(): starting 28983 1726883108.53160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/AnsiballZ_package_facts.py && sleep 0' 28983 1726883108.53599: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883108.53602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.53605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883108.53609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883108.53611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883108.53663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883108.53668: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883108.53748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883109.18759: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": 
"2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", 
"release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": 
"9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": 
"e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": 
"dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": 
[{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": 
"13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": 
"boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", 
"version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": 
"67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", 
"release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883109.20543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883109.20564: stderr chunk (state=3): >>><<< 28983 1726883109.20577: stdout chunk (state=3): >>><<< 28983 1726883109.20942: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": 
[{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", 
"version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": 
"libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": 
[{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": 
[{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": 
"systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": 
"shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": 
"perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": 
"noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", 
"version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", 
"version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", 
"version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", 
"version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": 
[{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": 
"1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883109.32070: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883109.32305: _low_level_execute_command(): starting 28983 1726883109.32328: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883108.4289513-34067-106992734094700/ > /dev/null 2>&1 && sleep 0' 28983 1726883109.33699: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883109.33703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883109.33706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883109.33709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883109.33711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883109.33713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883109.34139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883109.35967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883109.36041: stderr chunk (state=3): >>><<< 28983 1726883109.36044: stdout chunk (state=3): >>><<< 28983 1726883109.36240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883109.36243: handler run complete 28983 1726883109.39653: variable 'ansible_facts' 
from source: unknown 28983 1726883109.41325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.48220: variable 'ansible_facts' from source: unknown 28983 1726883109.49291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.51046: attempt loop complete, returning result 28983 1726883109.51069: _execute() done 28983 1726883109.51073: dumping result to json 28983 1726883109.51505: done dumping result, returning 28983 1726883109.51521: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000002206] 28983 1726883109.51531: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002206 28983 1726883109.55414: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002206 28983 1726883109.55423: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883109.55603: no more pending results, returning what we have 28983 1726883109.55606: results queue empty 28983 1726883109.55607: checking for any_errors_fatal 28983 1726883109.55615: done checking for any_errors_fatal 28983 1726883109.55616: checking for max_fail_percentage 28983 1726883109.55618: done checking for max_fail_percentage 28983 1726883109.55619: checking to see if all hosts have failed and the running result is not ok 28983 1726883109.55620: done checking to see if all hosts have failed 28983 1726883109.55621: getting the remaining hosts for this loop 28983 1726883109.55623: done getting the remaining hosts for this loop 28983 1726883109.55627: getting the next task for host managed_node2 28983 1726883109.55641: done getting next task for host managed_node2 28983 1726883109.55646: ^ task is: TASK: fedora.linux_system_roles.network : 
Print network provider 28983 1726883109.55652: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883109.55666: getting variables 28983 1726883109.55667: in VariableManager get_vars() 28983 1726883109.55709: Calling all_inventory to load vars for managed_node2 28983 1726883109.55712: Calling groups_inventory to load vars for managed_node2 28983 1726883109.55715: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883109.55724: Calling all_plugins_play to load vars for managed_node2 28983 1726883109.55728: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883109.55732: Calling groups_plugins_play to load vars for managed_node2 28983 1726883109.57963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.61517: done with get_vars() 28983 1726883109.61561: done getting variables 28983 1726883109.61828: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:45:09 -0400 (0:00:01.234) 0:02:19.616 ****** 28983 1726883109.61885: entering _queue_task() for managed_node2/debug 28983 1726883109.62725: worker is 1 (out of 1 available) 28983 1726883109.62741: exiting _queue_task() for managed_node2/debug 28983 1726883109.62757: done queuing things up, now waiting for results queue to drain 28983 1726883109.62759: waiting for pending results... 
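Each task in this section passes through the same gate: its `when` clauses are evaluated against the gathered facts, and the first clause that evaluates False short-circuits the task into a `skipping:` result carrying the failed expression (the `false_condition` and `skip_reason` fields that appear in the skipped-task results further down). A rough Python sketch of that short-circuit, assuming pre-rendered boolean conditions rather than real Jinja2 templating (this is an illustration of the behavior in the log, not Ansible's actual TaskExecutor code):

```python
def run_task(conditions: list[tuple[str, bool]]) -> dict:
    """Mimic the conditional gate seen in the log: return a skip result
    for the first False condition, otherwise an 'ok' result."""
    for expression, value in conditions:
        if not value:  # "when evaluation is False, skipping this task"
            return {
                "changed": False,
                "false_condition": expression,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": False}  # conditions all True: the handler would run

# The 'Abort applying the network state configuration' tasks below skip
# exactly this way: the distro check passes, the network_state check fails.
result = run_task([
    ("ansible_distribution_major_version != '6'", True),
    ("network_state != {}", False),
])
print(result["skip_reason"])  # Conditional result was False
```

The per-condition `Evaluated conditional (...): True/False` lines in the log correspond to each iteration of that loop.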
28983 1726883109.63345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883109.63615: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a4 28983 1726883109.63631: variable 'ansible_search_path' from source: unknown 28983 1726883109.63637: variable 'ansible_search_path' from source: unknown 28983 1726883109.63680: calling self._execute() 28983 1726883109.63993: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883109.64001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883109.64015: variable 'omit' from source: magic vars 28983 1726883109.64463: variable 'ansible_distribution_major_version' from source: facts 28983 1726883109.64477: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883109.64486: variable 'omit' from source: magic vars 28983 1726883109.64565: variable 'omit' from source: magic vars 28983 1726883109.64744: variable 'network_provider' from source: set_fact 28983 1726883109.64748: variable 'omit' from source: magic vars 28983 1726883109.64756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883109.64802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883109.64825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883109.64854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883109.64863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883109.64900: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883109.64903: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883109.64910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883109.65026: Set connection var ansible_connection to ssh 28983 1726883109.65074: Set connection var ansible_shell_executable to /bin/sh 28983 1726883109.65078: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883109.65080: Set connection var ansible_timeout to 10 28983 1726883109.65083: Set connection var ansible_pipelining to False 28983 1726883109.65085: Set connection var ansible_shell_type to sh 28983 1726883109.65099: variable 'ansible_shell_executable' from source: unknown 28983 1726883109.65103: variable 'ansible_connection' from source: unknown 28983 1726883109.65106: variable 'ansible_module_compression' from source: unknown 28983 1726883109.65182: variable 'ansible_shell_type' from source: unknown 28983 1726883109.65186: variable 'ansible_shell_executable' from source: unknown 28983 1726883109.65189: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883109.65191: variable 'ansible_pipelining' from source: unknown 28983 1726883109.65194: variable 'ansible_timeout' from source: unknown 28983 1726883109.65196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883109.65295: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883109.65309: variable 'omit' from source: magic vars 28983 1726883109.65315: starting attempt loop 28983 1726883109.65318: running the handler 28983 1726883109.65376: handler run complete 28983 1726883109.65396: attempt loop complete, returning result 28983 1726883109.65403: _execute() done 28983 1726883109.65406: dumping result to json 28983 1726883109.65409: done dumping result, returning 
28983 1726883109.65411: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-0000000021a4] 28983 1726883109.65509: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a4 28983 1726883109.65583: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a4 28983 1726883109.65586: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883109.65686: no more pending results, returning what we have 28983 1726883109.65690: results queue empty 28983 1726883109.65690: checking for any_errors_fatal 28983 1726883109.65699: done checking for any_errors_fatal 28983 1726883109.65700: checking for max_fail_percentage 28983 1726883109.65701: done checking for max_fail_percentage 28983 1726883109.65702: checking to see if all hosts have failed and the running result is not ok 28983 1726883109.65703: done checking to see if all hosts have failed 28983 1726883109.65704: getting the remaining hosts for this loop 28983 1726883109.65706: done getting the remaining hosts for this loop 28983 1726883109.65711: getting the next task for host managed_node2 28983 1726883109.65719: done getting next task for host managed_node2 28983 1726883109.65723: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883109.65729: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883109.65744: getting variables 28983 1726883109.65745: in VariableManager get_vars() 28983 1726883109.65792: Calling all_inventory to load vars for managed_node2 28983 1726883109.65795: Calling groups_inventory to load vars for managed_node2 28983 1726883109.65797: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883109.65805: Calling all_plugins_play to load vars for managed_node2 28983 1726883109.65808: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883109.65811: Calling groups_plugins_play to load vars for managed_node2 28983 1726883109.68028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.71426: done with get_vars() 28983 1726883109.71465: done getting variables 28983 1726883109.71545: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:45:09 -0400 (0:00:00.097) 0:02:19.713 ****** 28983 1726883109.71599: entering _queue_task() for managed_node2/fail 28983 1726883109.71994: worker is 1 (out of 1 available) 28983 1726883109.72007: exiting _queue_task() for managed_node2/fail 28983 1726883109.72021: done queuing things up, now waiting for results queue to drain 28983 1726883109.72023: waiting for pending results... 28983 1726883109.72403: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883109.72563: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a5 28983 1726883109.72588: variable 'ansible_search_path' from source: unknown 28983 1726883109.72607: variable 'ansible_search_path' from source: unknown 28983 1726883109.72717: calling self._execute() 28983 1726883109.72780: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883109.72798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883109.72817: variable 'omit' from source: magic vars 28983 1726883109.73347: variable 'ansible_distribution_major_version' from source: facts 28983 1726883109.73439: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883109.73546: variable 'network_state' from source: role '' defaults 28983 1726883109.73564: Evaluated conditional (network_state != {}): False 28983 1726883109.73574: when evaluation is False, skipping this task 28983 1726883109.73584: _execute() done 28983 1726883109.73597: dumping result to json 28983 1726883109.73639: done dumping result, returning 28983 1726883109.73643: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-0000000021a5] 28983 1726883109.73646: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a5 28983 1726883109.73901: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a5 28983 1726883109.73905: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883109.73967: no more pending results, returning what we have 28983 1726883109.73974: results queue empty 28983 1726883109.73975: checking for any_errors_fatal 28983 1726883109.73986: done checking for any_errors_fatal 28983 1726883109.73987: checking for max_fail_percentage 28983 1726883109.73989: done checking for max_fail_percentage 28983 1726883109.73990: checking to see if all hosts have failed and the running result is not ok 28983 1726883109.73991: done checking to see if all hosts have failed 28983 1726883109.73992: getting the remaining hosts for this loop 28983 1726883109.73994: done getting the remaining hosts for this loop 28983 1726883109.74000: getting the next task for host managed_node2 28983 1726883109.74011: done getting next task for host managed_node2 28983 1726883109.74216: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883109.74223: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883109.74247: getting variables 28983 1726883109.74249: in VariableManager get_vars() 28983 1726883109.74294: Calling all_inventory to load vars for managed_node2 28983 1726883109.74297: Calling groups_inventory to load vars for managed_node2 28983 1726883109.74300: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883109.74309: Calling all_plugins_play to load vars for managed_node2 28983 1726883109.74313: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883109.74316: Calling groups_plugins_play to load vars for managed_node2 28983 1726883109.76584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.80922: done with get_vars() 28983 1726883109.80973: done getting variables 28983 1726883109.81054: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:45:09 -0400 (0:00:00.095) 0:02:19.808 ****** 28983 1726883109.81108: entering _queue_task() for managed_node2/fail 28983 1726883109.81562: worker is 1 (out of 1 available) 28983 1726883109.81649: exiting _queue_task() for managed_node2/fail 28983 1726883109.81664: done queuing things up, now waiting for results queue to drain 28983 1726883109.81667: waiting for pending results... 28983 1726883109.82077: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883109.82416: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a6 28983 1726883109.82421: variable 'ansible_search_path' from source: unknown 28983 1726883109.82424: variable 'ansible_search_path' from source: unknown 28983 1726883109.82428: calling self._execute() 28983 1726883109.82538: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883109.82556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883109.82633: variable 'omit' from source: magic vars 28983 1726883109.83059: variable 'ansible_distribution_major_version' from source: facts 28983 1726883109.83091: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883109.83258: variable 'network_state' from source: role '' defaults 28983 1726883109.83286: Evaluated conditional (network_state != {}): False 28983 1726883109.83298: when evaluation is False, skipping this task 28983 1726883109.83311: _execute() done 28983 1726883109.83319: dumping result to json 28983 1726883109.83327: done dumping result, returning 28983 1726883109.83397: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-0000000021a6] 28983 1726883109.83402: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a6 28983 1726883109.83482: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a6 28983 1726883109.83485: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883109.83551: no more pending results, returning what we have 28983 1726883109.83555: results queue empty 28983 1726883109.83556: checking for any_errors_fatal 28983 1726883109.83568: done checking for any_errors_fatal 28983 1726883109.83570: checking for max_fail_percentage 28983 1726883109.83575: done checking for max_fail_percentage 28983 1726883109.83576: checking to see if all hosts have failed and the running result is not ok 28983 1726883109.83577: done checking to see if all hosts have failed 28983 1726883109.83578: getting the remaining hosts for this loop 28983 1726883109.83580: done getting the remaining hosts for this loop 28983 1726883109.83586: getting the next task for host managed_node2 28983 1726883109.83597: done getting next task for host managed_node2 28983 1726883109.83601: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883109.83611: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883109.83745: getting variables 28983 1726883109.83748: in VariableManager get_vars() 28983 1726883109.83808: Calling all_inventory to load vars for managed_node2 28983 1726883109.83812: Calling groups_inventory to load vars for managed_node2 28983 1726883109.83815: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883109.83829: Calling all_plugins_play to load vars for managed_node2 28983 1726883109.83938: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883109.83945: Calling groups_plugins_play to load vars for managed_node2 28983 1726883109.86615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.89440: done with get_vars() 28983 1726883109.89478: done getting variables 28983 1726883109.89531: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:45:09 -0400 (0:00:00.084) 0:02:19.893 ****** 28983 1726883109.89579: entering _queue_task() for managed_node2/fail 28983 1726883109.89970: worker is 1 (out of 1 available) 28983 1726883109.89985: exiting _queue_task() for managed_node2/fail 28983 1726883109.89998: done queuing things up, now waiting for results queue to drain 28983 1726883109.89999: waiting for pending results... 28983 1726883109.90456: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883109.90515: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a7 28983 1726883109.90538: variable 'ansible_search_path' from source: unknown 28983 1726883109.90550: variable 'ansible_search_path' from source: unknown 28983 1726883109.90640: calling self._execute() 28983 1726883109.90714: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883109.90727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883109.90746: variable 'omit' from source: magic vars 28983 1726883109.91247: variable 'ansible_distribution_major_version' from source: facts 28983 1726883109.91264: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883109.91506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883109.93830: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883109.93888: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883109.93922: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883109.93955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883109.93982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883109.94056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883109.94092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883109.94117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883109.94157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883109.94170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883109.94253: variable 'ansible_distribution_major_version' from source: facts 28983 1726883109.94266: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883109.94366: variable 'ansible_distribution' from source: facts 28983 1726883109.94370: variable '__network_rh_distros' from source: role '' defaults 28983 1726883109.94386: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883109.94389: when evaluation is False, skipping this task 28983 
1726883109.94392: _execute() done 28983 1726883109.94394: dumping result to json 28983 1726883109.94397: done dumping result, returning 28983 1726883109.94409: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-0000000021a7] 28983 1726883109.94425: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a7 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883109.94631: no more pending results, returning what we have 28983 1726883109.94637: results queue empty 28983 1726883109.94638: checking for any_errors_fatal 28983 1726883109.94646: done checking for any_errors_fatal 28983 1726883109.94647: checking for max_fail_percentage 28983 1726883109.94649: done checking for max_fail_percentage 28983 1726883109.94651: checking to see if all hosts have failed and the running result is not ok 28983 1726883109.94651: done checking to see if all hosts have failed 28983 1726883109.94652: getting the remaining hosts for this loop 28983 1726883109.94655: done getting the remaining hosts for this loop 28983 1726883109.94660: getting the next task for host managed_node2 28983 1726883109.94672: done getting next task for host managed_node2 28983 1726883109.94677: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883109.94684: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883109.94706: getting variables 28983 1726883109.94708: in VariableManager get_vars() 28983 1726883109.94766: Calling all_inventory to load vars for managed_node2 28983 1726883109.94773: Calling groups_inventory to load vars for managed_node2 28983 1726883109.94776: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883109.94794: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a7 28983 1726883109.94797: WORKER PROCESS EXITING 28983 1726883109.94806: Calling all_plugins_play to load vars for managed_node2 28983 1726883109.94833: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883109.94840: Calling groups_plugins_play to load vars for managed_node2 28983 1726883109.96843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883109.98897: done with get_vars() 28983 1726883109.98932: done getting variables 28983 1726883109.99003: Loading ActionModule 'dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:45:09 -0400 (0:00:00.094) 0:02:19.988 ****** 28983 1726883109.99050: entering _queue_task() for managed_node2/dnf 28983 1726883109.99415: worker is 1 (out of 1 available) 28983 1726883109.99428: exiting _queue_task() for managed_node2/dnf 28983 1726883109.99572: done queuing things up, now waiting for results queue to drain 28983 1726883109.99575: waiting for pending results... 28983 1726883109.99745: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883110.00040: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a8 28983 1726883110.00043: variable 'ansible_search_path' from source: unknown 28983 1726883110.00046: variable 'ansible_search_path' from source: unknown 28983 1726883110.00049: calling self._execute() 28983 1726883110.00065: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.00080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.00097: variable 'omit' from source: magic vars 28983 1726883110.00582: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.00602: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.00868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883110.02712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883110.02770: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883110.02800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883110.02831: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883110.02857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883110.02927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.03040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.03044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.03072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.03096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.03237: variable 'ansible_distribution' from source: facts 28983 1726883110.03249: variable 'ansible_distribution_major_version' from source: facts 28983 
1726883110.03264: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 28983 1726883110.03411: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.03599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.03633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.03672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.03742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.03759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.03815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.03859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.03896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883110.03952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.03980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.04039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.04081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.04151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.04159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.04176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.04304: variable 'network_connections' from source: include params 28983 1726883110.04315: variable 'interface' from source: play vars 28983 1726883110.04378: variable 'interface' from source: play vars 28983 1726883110.04437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883110.04586: Loading TestModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883110.04617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883110.04652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883110.04692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883110.04728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883110.04749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883110.04779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.04801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883110.04852: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883110.05077: variable 'network_connections' from source: include params 28983 1726883110.05086: variable 'interface' from source: play vars 28983 1726883110.05147: variable 'interface' from source: play vars 28983 1726883110.05176: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883110.05180: when evaluation is False, skipping this task 28983 1726883110.05183: _execute() done 28983 1726883110.05185: dumping result to json 28983 1726883110.05190: done dumping result, returning 28983 
1726883110.05197: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000021a8] 28983 1726883110.05203: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a8 28983 1726883110.05305: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a8 28983 1726883110.05309: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883110.05368: no more pending results, returning what we have 28983 1726883110.05375: results queue empty 28983 1726883110.05376: checking for any_errors_fatal 28983 1726883110.05386: done checking for any_errors_fatal 28983 1726883110.05387: checking for max_fail_percentage 28983 1726883110.05389: done checking for max_fail_percentage 28983 1726883110.05390: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.05391: done checking to see if all hosts have failed 28983 1726883110.05392: getting the remaining hosts for this loop 28983 1726883110.05395: done getting the remaining hosts for this loop 28983 1726883110.05399: getting the next task for host managed_node2 28983 1726883110.05409: done getting next task for host managed_node2 28983 1726883110.05413: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883110.05419: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883110.05444: getting variables 28983 1726883110.05445: in VariableManager get_vars() 28983 1726883110.05495: Calling all_inventory to load vars for managed_node2 28983 1726883110.05498: Calling groups_inventory to load vars for managed_node2 28983 1726883110.05501: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.05510: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.05514: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.05517: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.06958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.09074: done with get_vars() 28983 1726883110.09104: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883110.09169: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:45:10 -0400 (0:00:00.101) 0:02:20.089 ****** 28983 1726883110.09200: entering _queue_task() for managed_node2/yum 28983 1726883110.09477: worker is 1 (out of 1 available) 28983 1726883110.09494: exiting _queue_task() for managed_node2/yum 28983 1726883110.09508: done queuing things up, now waiting for results queue to drain 28983 1726883110.09510: waiting for pending results... 28983 1726883110.09711: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883110.09816: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021a9 28983 1726883110.09828: variable 'ansible_search_path' from source: unknown 28983 1726883110.09831: variable 'ansible_search_path' from source: unknown 28983 1726883110.09873: calling self._execute() 28983 1726883110.09966: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.09978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.09987: variable 'omit' from source: magic vars 28983 1726883110.10315: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.10326: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.10481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883110.12276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883110.12327: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883110.12361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883110.12395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883110.12418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883110.12490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.12826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.12852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.12886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.12900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.12980: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.12995: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883110.12998: when evaluation is False, skipping this task 28983 1726883110.13003: _execute() done 28983 1726883110.13006: dumping result to json 28983 1726883110.13012: done dumping result, returning 28983 1726883110.13021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000021a9] 28983 1726883110.13024: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a9 28983 1726883110.13127: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021a9 28983 1726883110.13130: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883110.13194: no more pending results, returning what we have 28983 1726883110.13198: results queue empty 28983 1726883110.13199: checking for any_errors_fatal 28983 1726883110.13206: done checking for any_errors_fatal 28983 1726883110.13207: checking for max_fail_percentage 28983 1726883110.13209: done checking for max_fail_percentage 28983 1726883110.13210: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.13211: done checking to see if all hosts have failed 28983 1726883110.13212: getting the remaining hosts for this loop 28983 1726883110.13214: done getting the remaining hosts for this loop 28983 1726883110.13219: getting the next task for host managed_node2 28983 1726883110.13230: done getting next task for host managed_node2 28983 1726883110.13242: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883110.13248: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883110.13274: getting variables 28983 1726883110.13276: in VariableManager get_vars() 28983 1726883110.13323: Calling all_inventory to load vars for managed_node2 28983 1726883110.13327: Calling groups_inventory to load vars for managed_node2 28983 1726883110.13330: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.13341: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.13344: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.13355: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.14780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.16383: done with get_vars() 28983 1726883110.16410: done getting variables 28983 1726883110.16461: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:45:10 -0400 (0:00:00.072) 0:02:20.162 ****** 28983 1726883110.16492: entering _queue_task() for managed_node2/fail 28983 1726883110.16760: worker is 1 (out of 1 available) 28983 1726883110.16776: exiting _queue_task() for managed_node2/fail 28983 1726883110.16790: done queuing things up, now waiting for results queue to drain 28983 1726883110.16792: waiting for pending results... 
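
Each of the skipped tasks in this trace follows the same pattern: the role guards the task with a `when:` conditional, ansible-core logs `Evaluated conditional (...)` for each clause, and on a `False` result emits `when evaluation is False, skipping this task` plus a `skipping: [host]` result with the `false_condition`. As a hypothetical reconstruction (the real task bodies live in the role's `tasks/main.yml` and are not shown in this log; only the `when:` clauses and the yum-to-dnf redirect are taken from the trace, the module arguments are an assumption), the YUM check at `main.yml:48` could look roughly like:

```yaml
# Hypothetical sketch -- not the actual role source.
# The two conditions below are the ones evaluated in the log above;
# ansible-core redirects ansible.builtin.yum to ansible.builtin.dnf, as logged.
- name: Check if updates for network packages are available through the
        YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ __network_packages }}"   # assumed variable name, for illustration only
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version != '6'      # True on this host
    - ansible_distribution_major_version | int < 8   # False here, so the task is skipped
```

Because the second clause evaluates to `False` on this managed host, the module never runs and the trace records `"false_condition": "ansible_distribution_major_version | int < 8"` in the skip result, exactly as seen above.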
28983 1726883110.17006: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883110.17138: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021aa 28983 1726883110.17153: variable 'ansible_search_path' from source: unknown 28983 1726883110.17156: variable 'ansible_search_path' from source: unknown 28983 1726883110.17192: calling self._execute() 28983 1726883110.17280: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.17287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.17298: variable 'omit' from source: magic vars 28983 1726883110.17630: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.17643: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.17746: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.17926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883110.19707: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883110.19765: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883110.19798: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883110.19828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883110.19852: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883110.19923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883110.19957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.19986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.20018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.20030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.20074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.20100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.20120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.20153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.20165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.20206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.20226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.20248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.20281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.20294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.20440: variable 'network_connections' from source: include params 28983 1726883110.20450: variable 'interface' from source: play vars 28983 1726883110.20506: variable 'interface' from source: play vars 28983 1726883110.20568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883110.20702: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883110.20736: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883110.20765: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883110.20794: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883110.20828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883110.20850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883110.20875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.20898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883110.20955: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883110.21160: variable 'network_connections' from source: include params 28983 1726883110.21165: variable 'interface' from source: play vars 28983 1726883110.21221: variable 'interface' from source: play vars 28983 1726883110.21249: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883110.21253: when evaluation is False, skipping this task 28983 1726883110.21256: _execute() done 28983 1726883110.21259: dumping result to json 28983 1726883110.21264: done dumping result, returning 28983 1726883110.21271: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000021aa] 28983 1726883110.21282: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021aa 28983 1726883110.21379: done sending task result for task 
0affe814-3a2d-b16d-c0a7-0000000021aa 28983 1726883110.21384: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883110.21444: no more pending results, returning what we have 28983 1726883110.21448: results queue empty 28983 1726883110.21449: checking for any_errors_fatal 28983 1726883110.21458: done checking for any_errors_fatal 28983 1726883110.21459: checking for max_fail_percentage 28983 1726883110.21461: done checking for max_fail_percentage 28983 1726883110.21462: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.21463: done checking to see if all hosts have failed 28983 1726883110.21463: getting the remaining hosts for this loop 28983 1726883110.21466: done getting the remaining hosts for this loop 28983 1726883110.21470: getting the next task for host managed_node2 28983 1726883110.21480: done getting next task for host managed_node2 28983 1726883110.21484: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883110.21490: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883110.21513: getting variables 28983 1726883110.21515: in VariableManager get_vars() 28983 1726883110.21563: Calling all_inventory to load vars for managed_node2 28983 1726883110.21567: Calling groups_inventory to load vars for managed_node2 28983 1726883110.21569: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.21579: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.21582: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.21586: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.22858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.24547: done with get_vars() 28983 1726883110.24575: done getting variables 28983 1726883110.24622: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:45:10 -0400 (0:00:00.081) 0:02:20.244 ****** 28983 1726883110.24654: entering _queue_task() for managed_node2/package 28983 1726883110.24907: worker is 1 (out of 1 available) 28983 1726883110.24920: exiting _queue_task() for managed_node2/package 28983 1726883110.24937: done queuing things up, now 
waiting for results queue to drain 28983 1726883110.24940: waiting for pending results... 28983 1726883110.25142: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883110.25282: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021ab 28983 1726883110.25295: variable 'ansible_search_path' from source: unknown 28983 1726883110.25299: variable 'ansible_search_path' from source: unknown 28983 1726883110.25331: calling self._execute() 28983 1726883110.25418: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.25424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.25437: variable 'omit' from source: magic vars 28983 1726883110.25769: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.25783: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.25957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883110.26181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883110.26217: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883110.26249: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883110.26306: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883110.26397: variable 'network_packages' from source: role '' defaults 28983 1726883110.26486: variable '__network_provider_setup' from source: role '' defaults 28983 1726883110.26495: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883110.26552: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883110.26556: variable '__network_packages_default_nm' 
from source: role '' defaults 28983 1726883110.26611: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883110.26775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883110.28350: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883110.28401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883110.28432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883110.28465: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883110.28489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883110.28560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.28586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.28606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.28642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.28657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.28698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.28717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.28741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.28777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.28787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.28977: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883110.29069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.29093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.29114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883110.29147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.29160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.29237: variable 'ansible_python' from source: facts 28983 1726883110.29252: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883110.29317: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883110.29392: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883110.29504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.29523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.29548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.29582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.29598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883110.29637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.29662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.29686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.29719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.29731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.29853: variable 'network_connections' from source: include params 28983 1726883110.29860: variable 'interface' from source: play vars 28983 1726883110.29947: variable 'interface' from source: play vars 28983 1726883110.30021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883110.30048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883110.30072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 
1726883110.30103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883110.30147: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.30384: variable 'network_connections' from source: include params 28983 1726883110.30388: variable 'interface' from source: play vars 28983 1726883110.30471: variable 'interface' from source: play vars 28983 1726883110.30516: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883110.30587: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.30842: variable 'network_connections' from source: include params 28983 1726883110.30848: variable 'interface' from source: play vars 28983 1726883110.31139: variable 'interface' from source: play vars 28983 1726883110.31143: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883110.31145: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883110.31690: variable 'network_connections' from source: include params 28983 1726883110.31704: variable 'interface' from source: play vars 28983 1726883110.31791: variable 'interface' from source: play vars 28983 1726883110.31880: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883110.31961: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883110.31980: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883110.32062: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883110.32409: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883110.33107: variable 'network_connections' from source: include params 28983 
1726883110.33121: variable 'interface' from source: play vars 28983 1726883110.33213: variable 'interface' from source: play vars 28983 1726883110.33232: variable 'ansible_distribution' from source: facts 28983 1726883110.33252: variable '__network_rh_distros' from source: role '' defaults 28983 1726883110.33267: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.33308: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883110.33552: variable 'ansible_distribution' from source: facts 28983 1726883110.33563: variable '__network_rh_distros' from source: role '' defaults 28983 1726883110.33581: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.33595: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883110.33832: variable 'ansible_distribution' from source: facts 28983 1726883110.33853: variable '__network_rh_distros' from source: role '' defaults 28983 1726883110.33867: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.33916: variable 'network_provider' from source: set_fact 28983 1726883110.33944: variable 'ansible_facts' from source: unknown 28983 1726883110.35196: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883110.35208: when evaluation is False, skipping this task 28983 1726883110.35217: _execute() done 28983 1726883110.35227: dumping result to json 28983 1726883110.35242: done dumping result, returning 28983 1726883110.35258: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-0000000021ab] 28983 1726883110.35280: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ab skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": 
"Conditional result was False" } 28983 1726883110.35504: no more pending results, returning what we have 28983 1726883110.35509: results queue empty 28983 1726883110.35510: checking for any_errors_fatal 28983 1726883110.35520: done checking for any_errors_fatal 28983 1726883110.35521: checking for max_fail_percentage 28983 1726883110.35524: done checking for max_fail_percentage 28983 1726883110.35525: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.35526: done checking to see if all hosts have failed 28983 1726883110.35527: getting the remaining hosts for this loop 28983 1726883110.35530: done getting the remaining hosts for this loop 28983 1726883110.35537: getting the next task for host managed_node2 28983 1726883110.35549: done getting next task for host managed_node2 28983 1726883110.35555: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883110.35561: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883110.35590: getting variables 28983 1726883110.35592: in VariableManager get_vars() 28983 1726883110.35653: Calling all_inventory to load vars for managed_node2 28983 1726883110.35657: Calling groups_inventory to load vars for managed_node2 28983 1726883110.35664: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ab 28983 1726883110.35676: WORKER PROCESS EXITING 28983 1726883110.35672: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.35688: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.35692: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.35696: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.37041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.38660: done with get_vars() 28983 1726883110.38689: done getting variables 28983 1726883110.38741: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:45:10 -0400 (0:00:00.141) 0:02:20.385 ****** 28983 1726883110.38773: entering _queue_task() for managed_node2/package 28983 1726883110.39020: worker is 1 (out of 1 available) 28983 1726883110.39036: exiting _queue_task() for managed_node2/package 28983 1726883110.39051: done queuing things up, now waiting 
for results queue to drain 28983 1726883110.39053: waiting for pending results... 28983 1726883110.39253: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883110.39378: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021ac 28983 1726883110.39392: variable 'ansible_search_path' from source: unknown 28983 1726883110.39398: variable 'ansible_search_path' from source: unknown 28983 1726883110.39431: calling self._execute() 28983 1726883110.39515: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.39523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.39533: variable 'omit' from source: magic vars 28983 1726883110.39858: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.39868: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.39974: variable 'network_state' from source: role '' defaults 28983 1726883110.39982: Evaluated conditional (network_state != {}): False 28983 1726883110.39985: when evaluation is False, skipping this task 28983 1726883110.39988: _execute() done 28983 1726883110.39993: dumping result to json 28983 1726883110.39997: done dumping result, returning 28983 1726883110.40005: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000021ac] 28983 1726883110.40011: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ac 28983 1726883110.40116: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ac 28983 1726883110.40119: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883110.40199: no more pending results, returning what we 
have 28983 1726883110.40203: results queue empty 28983 1726883110.40204: checking for any_errors_fatal 28983 1726883110.40210: done checking for any_errors_fatal 28983 1726883110.40210: checking for max_fail_percentage 28983 1726883110.40213: done checking for max_fail_percentage 28983 1726883110.40214: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.40215: done checking to see if all hosts have failed 28983 1726883110.40216: getting the remaining hosts for this loop 28983 1726883110.40217: done getting the remaining hosts for this loop 28983 1726883110.40221: getting the next task for host managed_node2 28983 1726883110.40229: done getting next task for host managed_node2 28983 1726883110.40234: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883110.40240: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883110.40261: getting variables 28983 1726883110.40263: in VariableManager get_vars() 28983 1726883110.40303: Calling all_inventory to load vars for managed_node2 28983 1726883110.40306: Calling groups_inventory to load vars for managed_node2 28983 1726883110.40308: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.40316: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.40319: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.40323: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.46078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.47650: done with get_vars() 28983 1726883110.47675: done getting variables 28983 1726883110.47715: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:45:10 -0400 (0:00:00.089) 0:02:20.475 ****** 28983 1726883110.47742: entering _queue_task() for managed_node2/package 28983 1726883110.48016: worker is 1 (out of 1 available) 28983 1726883110.48029: exiting _queue_task() for managed_node2/package 28983 1726883110.48046: done queuing things up, now waiting for results queue to drain 28983 1726883110.48049: waiting for pending results... 
28983 1726883110.48258: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883110.48397: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021ad 28983 1726883110.48410: variable 'ansible_search_path' from source: unknown 28983 1726883110.48414: variable 'ansible_search_path' from source: unknown 28983 1726883110.48449: calling self._execute() 28983 1726883110.48539: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.48548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.48558: variable 'omit' from source: magic vars 28983 1726883110.48895: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.48907: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.49024: variable 'network_state' from source: role '' defaults 28983 1726883110.49038: Evaluated conditional (network_state != {}): False 28983 1726883110.49042: when evaluation is False, skipping this task 28983 1726883110.49049: _execute() done 28983 1726883110.49052: dumping result to json 28983 1726883110.49055: done dumping result, returning 28983 1726883110.49064: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000021ad] 28983 1726883110.49068: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ad 28983 1726883110.49174: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ad 28983 1726883110.49178: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883110.49231: no more pending results, returning what we have 28983 1726883110.49238: results queue empty 28983 1726883110.49239: checking for 
any_errors_fatal 28983 1726883110.49248: done checking for any_errors_fatal 28983 1726883110.49249: checking for max_fail_percentage 28983 1726883110.49251: done checking for max_fail_percentage 28983 1726883110.49253: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.49254: done checking to see if all hosts have failed 28983 1726883110.49254: getting the remaining hosts for this loop 28983 1726883110.49258: done getting the remaining hosts for this loop 28983 1726883110.49263: getting the next task for host managed_node2 28983 1726883110.49273: done getting next task for host managed_node2 28983 1726883110.49277: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883110.49284: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883110.49305: getting variables 28983 1726883110.49306: in VariableManager get_vars() 28983 1726883110.49355: Calling all_inventory to load vars for managed_node2 28983 1726883110.49358: Calling groups_inventory to load vars for managed_node2 28983 1726883110.49360: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.49369: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.49372: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.49376: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.50585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.52180: done with get_vars() 28983 1726883110.52203: done getting variables 28983 1726883110.52251: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:45:10 -0400 (0:00:00.045) 0:02:20.520 ****** 28983 1726883110.52283: entering _queue_task() for managed_node2/service 28983 1726883110.52514: worker is 1 (out of 1 available) 28983 1726883110.52531: exiting _queue_task() for managed_node2/service 28983 1726883110.52547: done queuing things up, now waiting for results queue to drain 28983 1726883110.52549: waiting for pending results... 
28983 1726883110.52745: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883110.52875: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021ae 28983 1726883110.52895: variable 'ansible_search_path' from source: unknown 28983 1726883110.52899: variable 'ansible_search_path' from source: unknown 28983 1726883110.52929: calling self._execute() 28983 1726883110.53016: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.53023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.53033: variable 'omit' from source: magic vars 28983 1726883110.53354: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.53365: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.53474: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.53648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883110.55696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883110.55759: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883110.55794: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883110.55824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883110.55851: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883110.55917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883110.55943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.55969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.56003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.56016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.56059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.56085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.56105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.56138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.56150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.56191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.56211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.56232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.56267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.56281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.56418: variable 'network_connections' from source: include params 28983 1726883110.56428: variable 'interface' from source: play vars 28983 1726883110.56488: variable 'interface' from source: play vars 28983 1726883110.56553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883110.56685: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883110.56718: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883110.56758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883110.56783: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883110.56821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883110.56845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883110.56866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.56891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883110.56943: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883110.57141: variable 'network_connections' from source: include params 28983 1726883110.57147: variable 'interface' from source: play vars 28983 1726883110.57202: variable 'interface' from source: play vars 28983 1726883110.57228: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883110.57232: when evaluation is False, skipping this task 28983 1726883110.57237: _execute() done 28983 1726883110.57239: dumping result to json 28983 1726883110.57245: done dumping result, returning 28983 1726883110.57253: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000021ae] 28983 1726883110.57258: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021ae 28983 1726883110.57361: done sending task result for task 
0affe814-3a2d-b16d-c0a7-0000000021ae 28983 1726883110.57375: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883110.57426: no more pending results, returning what we have 28983 1726883110.57429: results queue empty 28983 1726883110.57430: checking for any_errors_fatal 28983 1726883110.57439: done checking for any_errors_fatal 28983 1726883110.57440: checking for max_fail_percentage 28983 1726883110.57442: done checking for max_fail_percentage 28983 1726883110.57443: checking to see if all hosts have failed and the running result is not ok 28983 1726883110.57444: done checking to see if all hosts have failed 28983 1726883110.57445: getting the remaining hosts for this loop 28983 1726883110.57448: done getting the remaining hosts for this loop 28983 1726883110.57452: getting the next task for host managed_node2 28983 1726883110.57460: done getting next task for host managed_node2 28983 1726883110.57464: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883110.57470: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883110.57494: getting variables 28983 1726883110.57495: in VariableManager get_vars() 28983 1726883110.57551: Calling all_inventory to load vars for managed_node2 28983 1726883110.57554: Calling groups_inventory to load vars for managed_node2 28983 1726883110.57556: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883110.57565: Calling all_plugins_play to load vars for managed_node2 28983 1726883110.57568: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883110.57574: Calling groups_plugins_play to load vars for managed_node2 28983 1726883110.58981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883110.60572: done with get_vars() 28983 1726883110.60597: done getting variables 28983 1726883110.60645: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:45:10 -0400 (0:00:00.083) 0:02:20.604 ****** 28983 1726883110.60673: entering _queue_task() for managed_node2/service 28983 1726883110.60905: worker is 1 (out of 1 available) 28983 1726883110.60918: exiting _queue_task() for managed_node2/service 28983 1726883110.60932: done 
queuing things up, now waiting for results queue to drain 28983 1726883110.60935: waiting for pending results... 28983 1726883110.61152: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883110.61261: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021af 28983 1726883110.61284: variable 'ansible_search_path' from source: unknown 28983 1726883110.61288: variable 'ansible_search_path' from source: unknown 28983 1726883110.61318: calling self._execute() 28983 1726883110.61409: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.61416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.61426: variable 'omit' from source: magic vars 28983 1726883110.61776: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.61788: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883110.61939: variable 'network_provider' from source: set_fact 28983 1726883110.61944: variable 'network_state' from source: role '' defaults 28983 1726883110.61958: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883110.61965: variable 'omit' from source: magic vars 28983 1726883110.62024: variable 'omit' from source: magic vars 28983 1726883110.62051: variable 'network_service_name' from source: role '' defaults 28983 1726883110.62109: variable 'network_service_name' from source: role '' defaults 28983 1726883110.62202: variable '__network_provider_setup' from source: role '' defaults 28983 1726883110.62207: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883110.62265: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883110.62272: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883110.62325: variable '__network_packages_default_nm' from source: role '' 
defaults 28983 1726883110.62527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883110.64255: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883110.64318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883110.64353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883110.64385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883110.64408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883110.64474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.64500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.64521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.64560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.64573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.64613: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.64632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.64659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.64692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.64705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.64901: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883110.64997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.65017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.65038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.65069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.65091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.65159: variable 'ansible_python' from source: facts 28983 1726883110.65173: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883110.65244: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883110.65312: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883110.65421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.65443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.65463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.65496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.65511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.65554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883110.65580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883110.65599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.65634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883110.65648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883110.65761: variable 'network_connections' from source: include params 28983 1726883110.65768: variable 'interface' from source: play vars 28983 1726883110.65830: variable 'interface' from source: play vars 28983 1726883110.65919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883110.66184: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883110.66226: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883110.66264: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883110.66304: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883110.66354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883110.66382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883110.66412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883110.66441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883110.66485: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.66717: variable 'network_connections' from source: include params 28983 1726883110.66729: variable 'interface' from source: play vars 28983 1726883110.66789: variable 'interface' from source: play vars 28983 1726883110.66828: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883110.66898: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883110.67141: variable 'network_connections' from source: include params 28983 1726883110.67148: variable 'interface' from source: play vars 28983 1726883110.67209: variable 'interface' from source: play vars 28983 1726883110.67230: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883110.67302: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883110.67544: variable 'network_connections' from source: include params 28983 1726883110.67547: variable 'interface' from source: play vars 28983 1726883110.67610: variable 'interface' from source: play vars 28983 1726883110.67663: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883110.67718: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883110.67725: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883110.67778: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883110.67960: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883110.68384: variable 'network_connections' from source: include params 28983 1726883110.68388: variable 'interface' from source: play vars 28983 1726883110.68439: variable 'interface' from source: play vars 28983 1726883110.68448: variable 'ansible_distribution' from source: facts 28983 1726883110.68454: variable '__network_rh_distros' from source: role '' defaults 28983 1726883110.68462: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.68487: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883110.68631: variable 'ansible_distribution' from source: facts 28983 1726883110.68637: variable '__network_rh_distros' from source: role '' defaults 28983 1726883110.68643: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.68650: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883110.68800: variable 'ansible_distribution' from source: facts 28983 1726883110.68804: variable '__network_rh_distros' from source: role '' defaults 28983 1726883110.68806: variable 'ansible_distribution_major_version' from source: facts 28983 1726883110.68836: variable 'network_provider' from source: set_fact 28983 1726883110.68855: variable 'omit' from source: magic vars 28983 1726883110.68881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883110.68907: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883110.68924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883110.68941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883110.68950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883110.68979: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883110.68983: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.68987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883110.69067: Set connection var ansible_connection to ssh 28983 1726883110.69080: Set connection var ansible_shell_executable to /bin/sh 28983 1726883110.69089: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883110.69097: Set connection var ansible_timeout to 10 28983 1726883110.69103: Set connection var ansible_pipelining to False 28983 1726883110.69108: Set connection var ansible_shell_type to sh 28983 1726883110.69130: variable 'ansible_shell_executable' from source: unknown 28983 1726883110.69134: variable 'ansible_connection' from source: unknown 28983 1726883110.69136: variable 'ansible_module_compression' from source: unknown 28983 1726883110.69144: variable 'ansible_shell_type' from source: unknown 28983 1726883110.69146: variable 'ansible_shell_executable' from source: unknown 28983 1726883110.69149: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883110.69151: variable 'ansible_pipelining' from source: unknown 28983 1726883110.69156: variable 'ansible_timeout' from source: unknown 28983 1726883110.69161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883110.69245: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883110.69257: variable 'omit' from source: magic vars 28983 1726883110.69263: starting attempt loop 28983 1726883110.69266: running the handler 28983 1726883110.69333: variable 'ansible_facts' from source: unknown 28983 1726883110.70036: _low_level_execute_command(): starting 28983 1726883110.70041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883110.70581: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883110.70585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.70588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883110.70590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.70640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883110.70648: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726883110.70661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883110.70733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883110.72506: stdout chunk (state=3): >>>/root <<< 28983 1726883110.72619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883110.72670: stderr chunk (state=3): >>><<< 28983 1726883110.72676: stdout chunk (state=3): >>><<< 28983 1726883110.72693: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883110.72704: _low_level_execute_command(): starting 28983 1726883110.72710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437 `" && echo ansible-tmp-1726883110.7269282-34147-14046186686437="` echo /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437 `" ) && sleep 0' 28983 1726883110.73175: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883110.73178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.73181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883110.73184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.73233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883110.73239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883110.73316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883110.75327: stdout chunk (state=3): >>>ansible-tmp-1726883110.7269282-34147-14046186686437=/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437 <<< 28983 1726883110.75447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 
1726883110.75490: stderr chunk (state=3): >>><<< 28983 1726883110.75493: stdout chunk (state=3): >>><<< 28983 1726883110.75506: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883110.7269282-34147-14046186686437=/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883110.75532: variable 'ansible_module_compression' from source: unknown 28983 1726883110.75574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883110.75628: variable 'ansible_facts' from source: unknown 28983 1726883110.75763: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py 28983 1726883110.75871: Sending initial data 28983 1726883110.75875: Sent initial data (155 bytes) 
28983 1726883110.76301: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883110.76341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883110.76344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.76349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883110.76352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.76398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883110.76401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883110.76475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883110.78177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883110.78181: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883110.78246: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883110.78312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpli1cv5pw /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py <<< 28983 1726883110.78320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py" <<< 28983 1726883110.78386: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpli1cv5pw" to remote "/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py" <<< 28983 1726883110.80244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883110.80303: stderr chunk (state=3): >>><<< 28983 1726883110.80306: stdout chunk (state=3): >>><<< 28983 1726883110.80324: done transferring module to remote 28983 1726883110.80335: _low_level_execute_command(): starting 28983 1726883110.80346: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/ /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py && sleep 0' 28983 1726883110.80791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883110.80794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.80797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883110.80799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883110.80801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.80855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883110.80858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883110.80932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883110.82850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883110.82898: stderr chunk (state=3): >>><<< 28983 1726883110.82901: stdout chunk (state=3): >>><<< 28983 1726883110.82913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883110.82916: _low_level_execute_command(): starting 28983 1726883110.82922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/AnsiballZ_systemd.py && sleep 0' 28983 1726883110.83354: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883110.83358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883110.83361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883110.83363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883110.83365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883110.83424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883110.83427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883110.83495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883111.16227: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4460544", "MemoryAvailable": "infinity", "CPUUsageNSec": "1710749000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", 
"StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726883111.16263: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 28983 1726883111.16276: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", 
"CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883111.18444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883111.18447: stdout chunk (state=3): >>><<< 28983 1726883111.18450: stderr chunk (state=3): >>><<< 28983 1726883111.18453: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4460544", "MemoryAvailable": "infinity", "CPUUsageNSec": "1710749000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883111.18737: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883111.18774: _low_level_execute_command(): starting 28983 1726883111.18840: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883110.7269282-34147-14046186686437/ > /dev/null 2>&1 && sleep 0' 28983 1726883111.19547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.19617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883111.19638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883111.19683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883111.19791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883111.21836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883111.21840: stdout chunk (state=3): >>><<< 28983 1726883111.21842: stderr chunk (state=3): >>><<< 28983 1726883111.21859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883111.21875: handler run complete 28983 1726883111.22019: attempt loop complete, returning result 28983 
1726883111.22022: _execute() done 28983 1726883111.22024: dumping result to json 28983 1726883111.22027: done dumping result, returning 28983 1726883111.22041: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-0000000021af] 28983 1726883111.22059: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021af ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883111.22712: no more pending results, returning what we have 28983 1726883111.22717: results queue empty 28983 1726883111.22718: checking for any_errors_fatal 28983 1726883111.22726: done checking for any_errors_fatal 28983 1726883111.22727: checking for max_fail_percentage 28983 1726883111.22730: done checking for max_fail_percentage 28983 1726883111.22731: checking to see if all hosts have failed and the running result is not ok 28983 1726883111.22732: done checking to see if all hosts have failed 28983 1726883111.22733: getting the remaining hosts for this loop 28983 1726883111.22738: done getting the remaining hosts for this loop 28983 1726883111.22743: getting the next task for host managed_node2 28983 1726883111.22758: done getting next task for host managed_node2 28983 1726883111.22763: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883111.22773: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883111.22789: getting variables 28983 1726883111.22791: in VariableManager get_vars() 28983 1726883111.22959: Calling all_inventory to load vars for managed_node2 28983 1726883111.22963: Calling groups_inventory to load vars for managed_node2 28983 1726883111.22973: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883111.22980: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021af 28983 1726883111.22983: WORKER PROCESS EXITING 28983 1726883111.22992: Calling all_plugins_play to load vars for managed_node2 28983 1726883111.22997: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883111.23001: Calling groups_plugins_play to load vars for managed_node2 28983 1726883111.25883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883111.29186: done with get_vars() 28983 1726883111.29221: done getting variables 28983 1726883111.29302: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:45:11 -0400 (0:00:00.686) 0:02:21.291 ****** 28983 1726883111.29362: entering _queue_task() for managed_node2/service 28983 1726883111.29751: worker is 1 (out of 1 available) 28983 1726883111.29764: exiting _queue_task() for managed_node2/service 28983 1726883111.29781: done queuing things up, now waiting for results queue to drain 28983 1726883111.29784: waiting for pending results... 28983 1726883111.30126: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883111.30345: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b0 28983 1726883111.30383: variable 'ansible_search_path' from source: unknown 28983 1726883111.30393: variable 'ansible_search_path' from source: unknown 28983 1726883111.30436: calling self._execute() 28983 1726883111.30573: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883111.30602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883111.30622: variable 'omit' from source: magic vars 28983 1726883111.31134: variable 'ansible_distribution_major_version' from source: facts 28983 1726883111.31162: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883111.31337: variable 'network_provider' from source: set_fact 28983 1726883111.31364: Evaluated conditional (network_provider == "nm"): True 28983 1726883111.31540: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883111.31629: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 28983 1726883111.31893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883111.34740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883111.34843: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883111.34901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883111.34931: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883111.34962: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883111.35062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883111.35096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883111.35116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883111.35150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883111.35163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883111.35211: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883111.35230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883111.35265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883111.35302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883111.35317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883111.35354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883111.35373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883111.35395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883111.35430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883111.35444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883111.35568: variable 'network_connections' from source: include params 28983 1726883111.35590: variable 'interface' from source: play vars 28983 1726883111.35939: variable 'interface' from source: play vars 28983 1726883111.35944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883111.36022: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883111.36086: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883111.36136: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883111.36190: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883111.36251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883111.36299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883111.36340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883111.36383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883111.36457: variable 
'__network_wireless_connections_defined' from source: role '' defaults 28983 1726883111.36903: variable 'network_connections' from source: include params 28983 1726883111.36917: variable 'interface' from source: play vars 28983 1726883111.37114: variable 'interface' from source: play vars 28983 1726883111.37220: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883111.37230: when evaluation is False, skipping this task 28983 1726883111.37240: _execute() done 28983 1726883111.37249: dumping result to json 28983 1726883111.37257: done dumping result, returning 28983 1726883111.37283: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-0000000021b0] 28983 1726883111.37494: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b0 28983 1726883111.37576: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b0 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883111.37642: no more pending results, returning what we have 28983 1726883111.37647: results queue empty 28983 1726883111.37649: checking for any_errors_fatal 28983 1726883111.37683: done checking for any_errors_fatal 28983 1726883111.37685: checking for max_fail_percentage 28983 1726883111.37687: done checking for max_fail_percentage 28983 1726883111.37689: checking to see if all hosts have failed and the running result is not ok 28983 1726883111.37690: done checking to see if all hosts have failed 28983 1726883111.37691: getting the remaining hosts for this loop 28983 1726883111.37694: done getting the remaining hosts for this loop 28983 1726883111.37700: getting the next task for host managed_node2 28983 1726883111.37711: done getting next task for host managed_node2 28983 1726883111.37716: ^ task is: TASK: fedora.linux_system_roles.network : Enable network 
service 28983 1726883111.37723: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883111.38060: getting variables 28983 1726883111.38063: in VariableManager get_vars() 28983 1726883111.38123: Calling all_inventory to load vars for managed_node2 28983 1726883111.38127: Calling groups_inventory to load vars for managed_node2 28983 1726883111.38130: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883111.38143: Calling all_plugins_play to load vars for managed_node2 28983 1726883111.38158: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883111.38267: Calling groups_plugins_play to load vars for managed_node2 28983 1726883111.39079: WORKER PROCESS EXITING 28983 1726883111.41395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883111.43148: done with get_vars() 28983 1726883111.43173: done getting variables 28983 1726883111.43220: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:45:11 -0400 (0:00:00.139) 0:02:21.430 ****** 28983 1726883111.43252: entering _queue_task() for managed_node2/service 28983 1726883111.43501: worker is 1 (out of 1 available) 28983 1726883111.43513: exiting _queue_task() for managed_node2/service 28983 1726883111.43525: done queuing things up, now waiting for results queue to drain 28983 1726883111.43527: waiting for pending results... 
28983 1726883111.43796: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883111.44003: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b1 28983 1726883111.44016: variable 'ansible_search_path' from source: unknown 28983 1726883111.44020: variable 'ansible_search_path' from source: unknown 28983 1726883111.44060: calling self._execute() 28983 1726883111.44188: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883111.44195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883111.44210: variable 'omit' from source: magic vars 28983 1726883111.44705: variable 'ansible_distribution_major_version' from source: facts 28983 1726883111.44717: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883111.44850: variable 'network_provider' from source: set_fact 28983 1726883111.44856: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883111.44859: when evaluation is False, skipping this task 28983 1726883111.44862: _execute() done 28983 1726883111.44867: dumping result to json 28983 1726883111.44872: done dumping result, returning 28983 1726883111.44883: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-0000000021b1] 28983 1726883111.44888: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b1 28983 1726883111.44983: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b1 28983 1726883111.44985: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883111.45065: no more pending results, returning what we have 28983 1726883111.45069: results queue empty 28983 1726883111.45070: checking for any_errors_fatal 28983 1726883111.45077: done checking for 
any_errors_fatal 28983 1726883111.45078: checking for max_fail_percentage 28983 1726883111.45080: done checking for max_fail_percentage 28983 1726883111.45081: checking to see if all hosts have failed and the running result is not ok 28983 1726883111.45082: done checking to see if all hosts have failed 28983 1726883111.45083: getting the remaining hosts for this loop 28983 1726883111.45085: done getting the remaining hosts for this loop 28983 1726883111.45090: getting the next task for host managed_node2 28983 1726883111.45097: done getting next task for host managed_node2 28983 1726883111.45102: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883111.45108: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883111.45129: getting variables 28983 1726883111.45131: in VariableManager get_vars() 28983 1726883111.45180: Calling all_inventory to load vars for managed_node2 28983 1726883111.45183: Calling groups_inventory to load vars for managed_node2 28983 1726883111.45186: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883111.45194: Calling all_plugins_play to load vars for managed_node2 28983 1726883111.45197: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883111.45201: Calling groups_plugins_play to load vars for managed_node2 28983 1726883111.46420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883111.48024: done with get_vars() 28983 1726883111.48047: done getting variables 28983 1726883111.48098: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:45:11 -0400 (0:00:00.048) 0:02:21.479 ****** 28983 1726883111.48127: entering _queue_task() for managed_node2/copy 28983 1726883111.48351: worker is 1 (out of 1 available) 28983 1726883111.48365: exiting _queue_task() for managed_node2/copy 28983 1726883111.48377: done queuing things up, now waiting for results queue to drain 28983 1726883111.48379: waiting for pending results... 
28983 1726883111.48570: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883111.48694: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b2 28983 1726883111.48706: variable 'ansible_search_path' from source: unknown 28983 1726883111.48711: variable 'ansible_search_path' from source: unknown 28983 1726883111.48745: calling self._execute() 28983 1726883111.48828: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883111.48834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883111.48847: variable 'omit' from source: magic vars 28983 1726883111.49161: variable 'ansible_distribution_major_version' from source: facts 28983 1726883111.49172: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883111.49272: variable 'network_provider' from source: set_fact 28983 1726883111.49282: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883111.49286: when evaluation is False, skipping this task 28983 1726883111.49289: _execute() done 28983 1726883111.49291: dumping result to json 28983 1726883111.49296: done dumping result, returning 28983 1726883111.49305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-0000000021b2] 28983 1726883111.49311: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b2 28983 1726883111.49412: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b2 28983 1726883111.49415: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883111.49473: no more pending results, returning what we have 28983 1726883111.49477: results queue empty 28983 1726883111.49478: checking for 
any_errors_fatal 28983 1726883111.49483: done checking for any_errors_fatal 28983 1726883111.49484: checking for max_fail_percentage 28983 1726883111.49486: done checking for max_fail_percentage 28983 1726883111.49487: checking to see if all hosts have failed and the running result is not ok 28983 1726883111.49488: done checking to see if all hosts have failed 28983 1726883111.49489: getting the remaining hosts for this loop 28983 1726883111.49491: done getting the remaining hosts for this loop 28983 1726883111.49495: getting the next task for host managed_node2 28983 1726883111.49503: done getting next task for host managed_node2 28983 1726883111.49507: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883111.49512: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883111.49541: getting variables 28983 1726883111.49543: in VariableManager get_vars() 28983 1726883111.49581: Calling all_inventory to load vars for managed_node2 28983 1726883111.49583: Calling groups_inventory to load vars for managed_node2 28983 1726883111.49585: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883111.49591: Calling all_plugins_play to load vars for managed_node2 28983 1726883111.49593: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883111.49595: Calling groups_plugins_play to load vars for managed_node2 28983 1726883111.50962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883111.52557: done with get_vars() 28983 1726883111.52584: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:45:11 -0400 (0:00:00.045) 0:02:21.524 ****** 28983 1726883111.52651: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883111.52875: worker is 1 (out of 1 available) 28983 1726883111.52890: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883111.52902: done queuing things up, now waiting for results queue to drain 28983 1726883111.52903: waiting for pending results... 
28983 1726883111.53104: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883111.53217: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b3 28983 1726883111.53230: variable 'ansible_search_path' from source: unknown 28983 1726883111.53235: variable 'ansible_search_path' from source: unknown 28983 1726883111.53274: calling self._execute() 28983 1726883111.53364: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883111.53369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883111.53381: variable 'omit' from source: magic vars 28983 1726883111.53707: variable 'ansible_distribution_major_version' from source: facts 28983 1726883111.53718: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883111.53725: variable 'omit' from source: magic vars 28983 1726883111.53789: variable 'omit' from source: magic vars 28983 1726883111.53946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883111.55651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883111.55711: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883111.55746: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883111.55779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883111.55802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883111.55869: variable 'network_provider' from source: set_fact 28983 1726883111.55977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883111.56001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883111.56022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883111.56056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883111.56069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883111.56136: variable 'omit' from source: magic vars 28983 1726883111.56231: variable 'omit' from source: magic vars 28983 1726883111.56323: variable 'network_connections' from source: include params 28983 1726883111.56335: variable 'interface' from source: play vars 28983 1726883111.56387: variable 'interface' from source: play vars 28983 1726883111.56519: variable 'omit' from source: magic vars 28983 1726883111.56529: variable '__lsr_ansible_managed' from source: task vars 28983 1726883111.56581: variable '__lsr_ansible_managed' from source: task vars 28983 1726883111.56740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883111.56915: Loaded config def from plugin (lookup/template) 28983 1726883111.56920: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883111.56947: File lookup term: get_ansible_managed.j2 28983 1726883111.56950: variable 
'ansible_search_path' from source: unknown 28983 1726883111.56954: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883111.56973: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883111.56985: variable 'ansible_search_path' from source: unknown 28983 1726883111.62680: variable 'ansible_managed' from source: unknown 28983 1726883111.62812: variable 'omit' from source: magic vars 28983 1726883111.62839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883111.62860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883111.62878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883111.62894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883111.62905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883111.62932: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883111.62938: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883111.62941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883111.63015: Set connection var ansible_connection to ssh 28983 1726883111.63028: Set connection var ansible_shell_executable to /bin/sh 28983 1726883111.63040: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883111.63049: Set connection var ansible_timeout to 10 28983 1726883111.63056: Set connection var ansible_pipelining to False 28983 1726883111.63058: Set connection var ansible_shell_type to sh 28983 1726883111.63080: variable 'ansible_shell_executable' from source: unknown 28983 1726883111.63083: variable 'ansible_connection' from source: unknown 28983 1726883111.63086: variable 'ansible_module_compression' from source: unknown 28983 1726883111.63090: variable 'ansible_shell_type' from source: unknown 28983 1726883111.63094: variable 'ansible_shell_executable' from source: unknown 28983 1726883111.63097: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883111.63102: variable 'ansible_pipelining' from source: unknown 28983 1726883111.63106: variable 'ansible_timeout' from source: unknown 28983 1726883111.63111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883111.63217: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883111.63231: variable 'omit' from 
source: magic vars 28983 1726883111.63236: starting attempt loop 28983 1726883111.63239: running the handler 28983 1726883111.63255: _low_level_execute_command(): starting 28983 1726883111.63264: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883111.63806: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.63810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.63813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883111.63816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.63863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883111.63866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883111.63868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883111.63952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883111.65738: stdout chunk (state=3): >>>/root <<< 28983 1726883111.65845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883111.65900: stderr chunk 
(state=3): >>><<< 28983 1726883111.65903: stdout chunk (state=3): >>><<< 28983 1726883111.65923: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883111.65933: _low_level_execute_command(): starting 28983 1726883111.65941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496 `" && echo ansible-tmp-1726883111.6592295-34174-78807499126496="` echo /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496 `" ) && sleep 0' 28983 1726883111.66392: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.66397: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.66400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883111.66402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.66406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.66458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883111.66461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883111.66544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883111.68613: stdout chunk (state=3): >>>ansible-tmp-1726883111.6592295-34174-78807499126496=/root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496 <<< 28983 1726883111.68730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883111.68777: stderr chunk (state=3): >>><<< 28983 1726883111.68781: stdout chunk (state=3): >>><<< 28983 1726883111.68798: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883111.6592295-34174-78807499126496=/root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883111.68832: variable 'ansible_module_compression' from source: unknown 28983 1726883111.68869: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883111.68899: variable 'ansible_facts' from source: unknown 28983 1726883111.68963: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py 28983 1726883111.69070: Sending initial data 28983 1726883111.69074: Sent initial data (167 bytes) 28983 1726883111.69493: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883111.69527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883111.69530: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883111.69539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.69542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.69544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.69596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883111.69602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883111.69674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883111.71341: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883111.71345: stderr chunk (state=3): >>>debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883111.71408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883111.71480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpb64tt64e /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py <<< 28983 1726883111.71483: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py" <<< 28983 1726883111.71548: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpb64tt64e" to remote "/root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py" <<< 28983 1726883111.71556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py" <<< 28983 1726883111.72782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883111.72847: stderr chunk (state=3): >>><<< 28983 1726883111.72850: stdout chunk (state=3): >>><<< 28983 1726883111.72869: done transferring module to remote 28983 1726883111.72882: _low_level_execute_command(): starting 28983 1726883111.72887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/ /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py && sleep 0' 28983 1726883111.73313: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.73350: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883111.73353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883111.73356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883111.73360: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.73362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.73413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883111.73420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883111.73491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883111.75410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883111.75457: stderr chunk (state=3): >>><<< 28983 1726883111.75460: stdout chunk (state=3): >>><<< 28983 1726883111.75477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883111.75481: _low_level_execute_command(): starting 28983 1726883111.75486: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/AnsiballZ_network_connections.py && sleep 0' 28983 1726883111.75896: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883111.75929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883111.75933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.75939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883111.75941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883111.75943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883111.75996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883111.76002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883111.76081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.07073: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 28983 1726883112.07147: stdout chunk (state=3): >>> <<< 28983 1726883112.10143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883112.10206: stderr chunk (state=3): >>><<< 28983 1726883112.10210: stdout chunk (state=3): >>><<< 28983 1726883112.10228: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883112.10271: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883112.10286: _low_level_execute_command(): starting 28983 1726883112.10290: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883111.6592295-34174-78807499126496/ > /dev/null 2>&1 && sleep 0' 28983 1726883112.10780: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883112.10783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883112.10786: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883112.10788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883112.10791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.10842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883112.10849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.10949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.12969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883112.13022: stderr chunk (state=3): >>><<< 28983 1726883112.13025: stdout chunk (state=3): >>><<< 28983 1726883112.13040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883112.13047: handler run complete 28983 1726883112.13078: attempt loop complete, returning result 28983 1726883112.13081: _execute() done 28983 1726883112.13087: dumping result to json 28983 1726883112.13096: done dumping result, returning 28983 1726883112.13105: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-0000000021b3] 28983 1726883112.13112: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b3 28983 1726883112.13228: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b3 28983 1726883112.13231: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a 28983 1726883112.13390: no more pending results, returning what we have 28983 1726883112.13394: results queue empty 28983 1726883112.13395: checking for any_errors_fatal 28983 1726883112.13402: done checking for any_errors_fatal 28983 1726883112.13403: checking for max_fail_percentage 28983 1726883112.13405: done checking for max_fail_percentage 28983 1726883112.13406: 
checking to see if all hosts have failed and the running result is not ok 28983 1726883112.13407: done checking to see if all hosts have failed 28983 1726883112.13408: getting the remaining hosts for this loop 28983 1726883112.13410: done getting the remaining hosts for this loop 28983 1726883112.13414: getting the next task for host managed_node2 28983 1726883112.13422: done getting next task for host managed_node2 28983 1726883112.13427: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883112.13432: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883112.13448: getting variables 28983 1726883112.13449: in VariableManager get_vars() 28983 1726883112.13499: Calling all_inventory to load vars for managed_node2 28983 1726883112.13502: Calling groups_inventory to load vars for managed_node2 28983 1726883112.13505: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.13514: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.13517: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.13521: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.14842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.16592: done with get_vars() 28983 1726883112.16615: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:45:12 -0400 (0:00:00.640) 0:02:22.164 ****** 28983 1726883112.16692: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883112.16950: worker is 1 (out of 1 available) 28983 1726883112.16964: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883112.16981: done queuing things up, now waiting for results queue to drain 28983 1726883112.16983: waiting for pending results... 
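The `module_args` recorded for the "Configure networking connection profiles" task above map back to role variables. A hypothetical reconstruction of what the caller likely passed in (variable names follow the fedora.linux_system_roles.network role's public interface; the values are copied verbatim from the logged invocation):

```yaml
# Sketch only: reconstructed from the module_args logged above.
network_connections:
  - name: statebr
    type: bridge
    persistent_state: present
    ip:
      dhcp4: false
      auto6: false
network_provider: nm  # matches "provider": "nm" in the logged module_args
# network_state is left at its role default ({}), which is why the
# "Configure networking state" task evaluates its conditional
# (network_state != {}) to False and is skipped.
```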
28983 1726883112.17191: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883112.17315: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b4 28983 1726883112.17330: variable 'ansible_search_path' from source: unknown 28983 1726883112.17334: variable 'ansible_search_path' from source: unknown 28983 1726883112.17374: calling self._execute() 28983 1726883112.17460: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.17467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.17479: variable 'omit' from source: magic vars 28983 1726883112.17813: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.17824: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.17931: variable 'network_state' from source: role '' defaults 28983 1726883112.17943: Evaluated conditional (network_state != {}): False 28983 1726883112.17946: when evaluation is False, skipping this task 28983 1726883112.17949: _execute() done 28983 1726883112.17954: dumping result to json 28983 1726883112.17959: done dumping result, returning 28983 1726883112.17967: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-0000000021b4] 28983 1726883112.17976: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b4 28983 1726883112.18077: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b4 28983 1726883112.18081: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883112.18155: no more pending results, returning what we have 28983 1726883112.18159: results queue empty 28983 1726883112.18160: checking for any_errors_fatal 28983 1726883112.18173: done checking for any_errors_fatal 
28983 1726883112.18174: checking for max_fail_percentage 28983 1726883112.18176: done checking for max_fail_percentage 28983 1726883112.18177: checking to see if all hosts have failed and the running result is not ok 28983 1726883112.18178: done checking to see if all hosts have failed 28983 1726883112.18179: getting the remaining hosts for this loop 28983 1726883112.18180: done getting the remaining hosts for this loop 28983 1726883112.18184: getting the next task for host managed_node2 28983 1726883112.18192: done getting next task for host managed_node2 28983 1726883112.18199: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883112.18204: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883112.18225: getting variables 28983 1726883112.18226: in VariableManager get_vars() 28983 1726883112.18268: Calling all_inventory to load vars for managed_node2 28983 1726883112.18274: Calling groups_inventory to load vars for managed_node2 28983 1726883112.18281: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.18288: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.18290: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.18292: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.19509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.21127: done with get_vars() 28983 1726883112.21151: done getting variables 28983 1726883112.21199: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:45:12 -0400 (0:00:00.045) 0:02:22.210 ****** 28983 1726883112.21228: entering _queue_task() for managed_node2/debug 28983 1726883112.21458: worker is 1 (out of 1 available) 28983 1726883112.21474: exiting _queue_task() for managed_node2/debug 28983 1726883112.21488: done queuing things up, now waiting for results queue to drain 28983 1726883112.21490: waiting for pending results... 
28983 1726883112.21680: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883112.21793: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b5 28983 1726883112.21806: variable 'ansible_search_path' from source: unknown 28983 1726883112.21810: variable 'ansible_search_path' from source: unknown 28983 1726883112.21847: calling self._execute() 28983 1726883112.21927: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.21932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.21948: variable 'omit' from source: magic vars 28983 1726883112.22268: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.22283: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.22287: variable 'omit' from source: magic vars 28983 1726883112.22342: variable 'omit' from source: magic vars 28983 1726883112.22373: variable 'omit' from source: magic vars 28983 1726883112.22411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883112.22443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883112.22461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883112.22478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.22492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.22521: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883112.22525: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.22529: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883112.22612: Set connection var ansible_connection to ssh 28983 1726883112.22622: Set connection var ansible_shell_executable to /bin/sh 28983 1726883112.22630: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883112.22640: Set connection var ansible_timeout to 10 28983 1726883112.22647: Set connection var ansible_pipelining to False 28983 1726883112.22650: Set connection var ansible_shell_type to sh 28983 1726883112.22673: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.22676: variable 'ansible_connection' from source: unknown 28983 1726883112.22680: variable 'ansible_module_compression' from source: unknown 28983 1726883112.22682: variable 'ansible_shell_type' from source: unknown 28983 1726883112.22684: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.22687: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.22692: variable 'ansible_pipelining' from source: unknown 28983 1726883112.22695: variable 'ansible_timeout' from source: unknown 28983 1726883112.22701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.22817: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883112.22832: variable 'omit' from source: magic vars 28983 1726883112.22836: starting attempt loop 28983 1726883112.22843: running the handler 28983 1726883112.22955: variable '__network_connections_result' from source: set_fact 28983 1726883112.23002: handler run complete 28983 1726883112.23018: attempt loop complete, returning result 28983 1726883112.23022: _execute() done 28983 1726883112.23025: dumping result to json 28983 1726883112.23030: 
done dumping result, returning 28983 1726883112.23044: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000021b5] 28983 1726883112.23053: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b5 28983 1726883112.23147: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b5 28983 1726883112.23150: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a" ] } 28983 1726883112.23231: no more pending results, returning what we have 28983 1726883112.23237: results queue empty 28983 1726883112.23238: checking for any_errors_fatal 28983 1726883112.23244: done checking for any_errors_fatal 28983 1726883112.23245: checking for max_fail_percentage 28983 1726883112.23247: done checking for max_fail_percentage 28983 1726883112.23248: checking to see if all hosts have failed and the running result is not ok 28983 1726883112.23249: done checking to see if all hosts have failed 28983 1726883112.23250: getting the remaining hosts for this loop 28983 1726883112.23252: done getting the remaining hosts for this loop 28983 1726883112.23256: getting the next task for host managed_node2 28983 1726883112.23267: done getting next task for host managed_node2 28983 1726883112.23272: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883112.23278: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883112.23292: getting variables 28983 1726883112.23293: in VariableManager get_vars() 28983 1726883112.23332: Calling all_inventory to load vars for managed_node2 28983 1726883112.23343: Calling groups_inventory to load vars for managed_node2 28983 1726883112.23346: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.23353: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.23356: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.23359: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.24770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.26355: done with get_vars() 28983 1726883112.26377: done getting variables 28983 1726883112.26425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:45:12 -0400 (0:00:00.052) 0:02:22.262 ****** 28983 1726883112.26458: entering _queue_task() for managed_node2/debug 28983 1726883112.26689: worker is 1 (out of 1 available) 28983 1726883112.26702: exiting _queue_task() for managed_node2/debug 28983 1726883112.26716: done queuing things up, now waiting for results queue to drain 28983 1726883112.26718: waiting for pending results... 28983 1726883112.26921: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883112.27051: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b6 28983 1726883112.27066: variable 'ansible_search_path' from source: unknown 28983 1726883112.27070: variable 'ansible_search_path' from source: unknown 28983 1726883112.27104: calling self._execute() 28983 1726883112.27190: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.27196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.27207: variable 'omit' from source: magic vars 28983 1726883112.27532: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.27545: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.27552: variable 'omit' from source: magic vars 28983 1726883112.27605: variable 'omit' from source: magic vars 28983 1726883112.27642: variable 'omit' from source: magic vars 28983 1726883112.27682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883112.27718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883112.27737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883112.27754: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.27764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.27793: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883112.27796: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.27800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.27884: Set connection var ansible_connection to ssh 28983 1726883112.27894: Set connection var ansible_shell_executable to /bin/sh 28983 1726883112.27903: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883112.27911: Set connection var ansible_timeout to 10 28983 1726883112.27918: Set connection var ansible_pipelining to False 28983 1726883112.27921: Set connection var ansible_shell_type to sh 28983 1726883112.27946: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.27949: variable 'ansible_connection' from source: unknown 28983 1726883112.27952: variable 'ansible_module_compression' from source: unknown 28983 1726883112.27955: variable 'ansible_shell_type' from source: unknown 28983 1726883112.27960: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.27963: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.27968: variable 'ansible_pipelining' from source: unknown 28983 1726883112.27975: variable 'ansible_timeout' from source: unknown 28983 1726883112.27977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.28098: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883112.28108: variable 'omit' from source: magic vars 28983 1726883112.28114: starting attempt loop 28983 1726883112.28117: running the handler 28983 1726883112.28165: variable '__network_connections_result' from source: set_fact 28983 1726883112.28228: variable '__network_connections_result' from source: set_fact 28983 1726883112.28336: handler run complete 28983 1726883112.28361: attempt loop complete, returning result 28983 1726883112.28365: _execute() done 28983 1726883112.28368: dumping result to json 28983 1726883112.28383: done dumping result, returning 28983 1726883112.28387: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000021b6] 28983 1726883112.28389: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b6 28983 1726883112.28492: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b6 28983 1726883112.28496: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a" ] } } 28983 1726883112.28604: no more pending results, returning what we have 28983 1726883112.28609: results queue 
empty 28983 1726883112.28610: checking for any_errors_fatal 28983 1726883112.28616: done checking for any_errors_fatal 28983 1726883112.28617: checking for max_fail_percentage 28983 1726883112.28619: done checking for max_fail_percentage 28983 1726883112.28620: checking to see if all hosts have failed and the running result is not ok 28983 1726883112.28621: done checking to see if all hosts have failed 28983 1726883112.28622: getting the remaining hosts for this loop 28983 1726883112.28624: done getting the remaining hosts for this loop 28983 1726883112.28627: getting the next task for host managed_node2 28983 1726883112.28642: done getting next task for host managed_node2 28983 1726883112.28646: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883112.28650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883112.28664: getting variables 28983 1726883112.28665: in VariableManager get_vars() 28983 1726883112.28705: Calling all_inventory to load vars for managed_node2 28983 1726883112.28707: Calling groups_inventory to load vars for managed_node2 28983 1726883112.28709: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.28715: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.28717: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.28719: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.29944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.31664: done with get_vars() 28983 1726883112.31691: done getting variables 28983 1726883112.31736: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:45:12 -0400 (0:00:00.053) 0:02:22.315 ****** 28983 1726883112.31761: entering _queue_task() for managed_node2/debug 28983 1726883112.31985: worker is 1 (out of 1 available) 28983 1726883112.32000: exiting _queue_task() for managed_node2/debug 28983 1726883112.32012: done queuing things up, now waiting for results queue to drain 28983 1726883112.32014: waiting for pending results... 
28983 1726883112.32208: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883112.32321: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b7 28983 1726883112.32337: variable 'ansible_search_path' from source: unknown 28983 1726883112.32341: variable 'ansible_search_path' from source: unknown 28983 1726883112.32377: calling self._execute() 28983 1726883112.32463: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.32472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.32480: variable 'omit' from source: magic vars 28983 1726883112.32808: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.32819: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.32925: variable 'network_state' from source: role '' defaults 28983 1726883112.32936: Evaluated conditional (network_state != {}): False 28983 1726883112.32939: when evaluation is False, skipping this task 28983 1726883112.32942: _execute() done 28983 1726883112.32947: dumping result to json 28983 1726883112.32952: done dumping result, returning 28983 1726883112.32961: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-0000000021b7] 28983 1726883112.32967: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b7 28983 1726883112.33064: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b7 28983 1726883112.33067: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883112.33124: no more pending results, returning what we have 28983 1726883112.33127: results queue empty 28983 1726883112.33128: checking for any_errors_fatal 28983 1726883112.33139: done checking for any_errors_fatal 28983 1726883112.33140: checking for 
max_fail_percentage 28983 1726883112.33142: done checking for max_fail_percentage 28983 1726883112.33143: checking to see if all hosts have failed and the running result is not ok 28983 1726883112.33144: done checking to see if all hosts have failed 28983 1726883112.33145: getting the remaining hosts for this loop 28983 1726883112.33147: done getting the remaining hosts for this loop 28983 1726883112.33151: getting the next task for host managed_node2 28983 1726883112.33159: done getting next task for host managed_node2 28983 1726883112.33163: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883112.33169: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883112.33192: getting variables 28983 1726883112.33194: in VariableManager get_vars() 28983 1726883112.33241: Calling all_inventory to load vars for managed_node2 28983 1726883112.33244: Calling groups_inventory to load vars for managed_node2 28983 1726883112.33246: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.33254: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.33256: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.33258: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.34489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.36104: done with get_vars() 28983 1726883112.36127: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:45:12 -0400 (0:00:00.044) 0:02:22.359 ****** 28983 1726883112.36209: entering _queue_task() for managed_node2/ping 28983 1726883112.36451: worker is 1 (out of 1 available) 28983 1726883112.36467: exiting _queue_task() for managed_node2/ping 28983 1726883112.36481: done queuing things up, now waiting for results queue to drain 28983 1726883112.36483: waiting for pending results... 
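The "Re-test connectivity" task queued above uses Ansible's `ping` action plugin, which performs a full module round-trip over the configured connection rather than an ICMP ping. A minimal standalone equivalent (illustrative; only the task name and module come from the log):

```yaml
- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Re-test connectivity
      ansible.builtin.ping:
```

A successful run returns `"ping": "pong"`, confirming that the SSH transport, remote Python interpreter, and module pipeline all work end to end.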
28983 1726883112.36695: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883112.36797: in run() - task 0affe814-3a2d-b16d-c0a7-0000000021b8 28983 1726883112.36812: variable 'ansible_search_path' from source: unknown 28983 1726883112.36817: variable 'ansible_search_path' from source: unknown 28983 1726883112.36854: calling self._execute() 28983 1726883112.36948: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.36954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.36964: variable 'omit' from source: magic vars 28983 1726883112.37301: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.37312: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.37319: variable 'omit' from source: magic vars 28983 1726883112.37380: variable 'omit' from source: magic vars 28983 1726883112.37407: variable 'omit' from source: magic vars 28983 1726883112.37445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883112.37482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883112.37499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883112.37515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.37526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.37556: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883112.37559: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.37564: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883112.37649: Set connection var ansible_connection to ssh 28983 1726883112.37660: Set connection var ansible_shell_executable to /bin/sh 28983 1726883112.37668: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883112.37680: Set connection var ansible_timeout to 10 28983 1726883112.37686: Set connection var ansible_pipelining to False 28983 1726883112.37689: Set connection var ansible_shell_type to sh 28983 1726883112.37714: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.37718: variable 'ansible_connection' from source: unknown 28983 1726883112.37721: variable 'ansible_module_compression' from source: unknown 28983 1726883112.37723: variable 'ansible_shell_type' from source: unknown 28983 1726883112.37728: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.37731: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.37739: variable 'ansible_pipelining' from source: unknown 28983 1726883112.37742: variable 'ansible_timeout' from source: unknown 28983 1726883112.37748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.37918: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883112.37928: variable 'omit' from source: magic vars 28983 1726883112.37935: starting attempt loop 28983 1726883112.37940: running the handler 28983 1726883112.37952: _low_level_execute_command(): starting 28983 1726883112.37960: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883112.38513: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883112.38518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883112.38522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.38579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883112.38587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.38662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.40472: stdout chunk (state=3): >>>/root <<< 28983 1726883112.40674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883112.40678: stdout chunk (state=3): >>><<< 28983 1726883112.40681: stderr chunk (state=3): >>><<< 28983 1726883112.40711: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883112.40740: _low_level_execute_command(): starting 28983 1726883112.40829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965 `" && echo ansible-tmp-1726883112.4071915-34186-48357514283965="` echo /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965 `" ) && sleep 0' 28983 1726883112.41412: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883112.41452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.41474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 28983 1726883112.41519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.41642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883112.41645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.41828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.43780: stdout chunk (state=3): >>>ansible-tmp-1726883112.4071915-34186-48357514283965=/root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965 <<< 28983 1726883112.43898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883112.43939: stderr chunk (state=3): >>><<< 28983 1726883112.43942: stdout chunk (state=3): >>><<< 28983 1726883112.43964: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883112.4071915-34186-48357514283965=/root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883112.43999: variable 'ansible_module_compression' from source: unknown 28983 1726883112.44029: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883112.44057: variable 'ansible_facts' from source: unknown 28983 1726883112.44113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py 28983 1726883112.44219: Sending initial data 28983 1726883112.44222: Sent initial data (152 bytes) 28983 1726883112.44628: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883112.44664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883112.44667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.44673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883112.44676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.44729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883112.44736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.44799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.46424: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28983 1726883112.46429: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883112.46492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883112.46566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpsgvvbcsr /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py <<< 28983 1726883112.46573: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py" <<< 28983 1726883112.46625: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpsgvvbcsr" to remote "/root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py" <<< 28983 1726883112.47493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883112.47549: stderr chunk (state=3): >>><<< 28983 1726883112.47554: stdout chunk (state=3): >>><<< 28983 1726883112.47573: done transferring module to remote 28983 1726883112.47583: _low_level_execute_command(): starting 28983 1726883112.47588: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/ /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py && sleep 0' 28983 1726883112.48019: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883112.48023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.48026: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883112.48029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.48081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883112.48084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.48150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.50023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883112.50075: stderr chunk (state=3): >>><<< 28983 1726883112.50080: stdout chunk (state=3): >>><<< 28983 1726883112.50090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883112.50093: _low_level_execute_command(): starting 28983 1726883112.50098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/AnsiballZ_ping.py && sleep 0' 28983 1726883112.50523: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883112.50526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.50529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883112.50531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.50579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883112.50583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.50664: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 28983 1726883112.67727: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883112.69337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883112.69351: stderr chunk (state=3): >>><<< 28983 1726883112.69360: stdout chunk (state=3): >>><<< 28983 1726883112.69389: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
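The `{"ping": "pong", ...}` stdout above is the JSON the transferred `AnsiballZ_ping.py` module prints on the remote host. A hedged sketch of the module's logic (the real module is `ansible.modules.ping`; this is a simplified stand-in, not its exact source): it echoes its `data` argument back, defaulting to `"pong"`, and the controller parses that JSON from stdout.

```python
import json

def ping(module_args=None):
    # Echo the "data" argument back, defaulting to "pong".
    data = (module_args or {}).get("data", "pong")
    if data == "crash":
        # The real module deliberately raises here so failure paths can be tested.
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# The controller reads the module result as JSON from the remote stdout:
stdout = json.dumps(ping())
```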
28983 1726883112.69422: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883112.69523: _low_level_execute_command(): starting 28983 1726883112.69527: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883112.4071915-34186-48357514283965/ > /dev/null 2>&1 && sleep 0' 28983 1726883112.70056: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883112.70070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883112.70153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883112.70205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883112.70221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883112.70257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883112.70350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883112.72369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883112.72382: stdout chunk (state=3): >>><<< 28983 1726883112.72399: stderr chunk (state=3): >>><<< 28983 1726883112.72418: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883112.72639: handler run complete 28983 1726883112.72643: attempt loop complete, returning result 28983 
1726883112.72645: _execute() done 28983 1726883112.72647: dumping result to json 28983 1726883112.72650: done dumping result, returning 28983 1726883112.72652: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-0000000021b8] 28983 1726883112.72654: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b8 28983 1726883112.72732: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000021b8 28983 1726883112.72737: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883112.72822: no more pending results, returning what we have 28983 1726883112.72826: results queue empty 28983 1726883112.72827: checking for any_errors_fatal 28983 1726883112.72839: done checking for any_errors_fatal 28983 1726883112.72840: checking for max_fail_percentage 28983 1726883112.72842: done checking for max_fail_percentage 28983 1726883112.72844: checking to see if all hosts have failed and the running result is not ok 28983 1726883112.72845: done checking to see if all hosts have failed 28983 1726883112.72847: getting the remaining hosts for this loop 28983 1726883112.72850: done getting the remaining hosts for this loop 28983 1726883112.72855: getting the next task for host managed_node2 28983 1726883112.72869: done getting next task for host managed_node2 28983 1726883112.72874: ^ task is: TASK: meta (role_complete) 28983 1726883112.72882: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883112.72898: getting variables 28983 1726883112.72899: in VariableManager get_vars() 28983 1726883112.73085: Calling all_inventory to load vars for managed_node2 28983 1726883112.73089: Calling groups_inventory to load vars for managed_node2 28983 1726883112.73092: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.73102: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.73107: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.73111: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.75995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.79177: done with get_vars() 28983 1726883112.79217: done getting variables 28983 1726883112.79323: done queuing things up, now waiting for results queue to drain 28983 1726883112.79325: results queue empty 28983 1726883112.79326: checking for any_errors_fatal 28983 1726883112.79330: done checking for any_errors_fatal 28983 1726883112.79331: checking for max_fail_percentage 28983 1726883112.79332: done checking for max_fail_percentage 28983 1726883112.79333: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883112.79337: done checking to see if all hosts have failed 28983 1726883112.79338: getting the remaining hosts for this loop 28983 1726883112.79339: done getting the remaining hosts for this loop 28983 1726883112.79342: getting the next task for host managed_node2 28983 1726883112.79348: done getting next task for host managed_node2 28983 1726883112.79351: ^ task is: TASK: Show result 28983 1726883112.79354: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883112.79358: getting variables 28983 1726883112.79359: in VariableManager get_vars() 28983 1726883112.79376: Calling all_inventory to load vars for managed_node2 28983 1726883112.79379: Calling groups_inventory to load vars for managed_node2 28983 1726883112.79383: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.79393: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.79397: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.79401: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.81494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.84796: done with get_vars() 28983 1726883112.84836: done getting variables 28983 1726883112.84896: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:45:12 -0400 (0:00:00.487) 0:02:22.847 ****** 28983 1726883112.84941: entering _queue_task() for managed_node2/debug 28983 1726883112.85466: worker is 1 (out of 1 available) 28983 1726883112.85547: exiting _queue_task() for managed_node2/debug 28983 1726883112.85562: done queuing things up, now waiting for results queue to drain 28983 1726883112.85564: waiting for pending results... 
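The "Show result" task being queued here is a `debug` action. Conceptually it just resolves the named variable (`__network_connections_result`, set earlier via `set_fact`) from the task's variables and serializes it, producing the indented `ok: [managed_node2] => {...}` block that follows. A minimal sketch of that lookup-and-dump step, using an abbreviated copy of the result seen below (not Ansible's debug plugin itself):

```python
import json

task_vars = {
    "__network_connections_result": {
        "changed": True,
        "failed": False,
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': "
            "add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a"
        ],
    }
}

def show_var(name, variables):
    # debug with "var:" resolves the variable and renders it as indented JSON.
    return json.dumps({name: variables[name]}, indent=4, sort_keys=True)

output = show_var("__network_connections_result", task_vars)
```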
28983 1726883112.86054: running TaskExecutor() for managed_node2/TASK: Show result 28983 1726883112.86180: in run() - task 0affe814-3a2d-b16d-c0a7-00000000213a 28983 1726883112.86185: variable 'ansible_search_path' from source: unknown 28983 1726883112.86188: variable 'ansible_search_path' from source: unknown 28983 1726883112.86191: calling self._execute() 28983 1726883112.86259: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.86265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.86284: variable 'omit' from source: magic vars 28983 1726883112.86725: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.86740: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.86748: variable 'omit' from source: magic vars 28983 1726883112.86809: variable 'omit' from source: magic vars 28983 1726883112.86850: variable 'omit' from source: magic vars 28983 1726883112.86896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883112.86938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883112.86962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883112.86983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.86996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883112.87033: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883112.87038: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.87045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.87237: Set 
connection var ansible_connection to ssh 28983 1726883112.87242: Set connection var ansible_shell_executable to /bin/sh 28983 1726883112.87245: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883112.87248: Set connection var ansible_timeout to 10 28983 1726883112.87251: Set connection var ansible_pipelining to False 28983 1726883112.87253: Set connection var ansible_shell_type to sh 28983 1726883112.87255: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.87259: variable 'ansible_connection' from source: unknown 28983 1726883112.87262: variable 'ansible_module_compression' from source: unknown 28983 1726883112.87264: variable 'ansible_shell_type' from source: unknown 28983 1726883112.87266: variable 'ansible_shell_executable' from source: unknown 28983 1726883112.87269: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.87274: variable 'ansible_pipelining' from source: unknown 28983 1726883112.87276: variable 'ansible_timeout' from source: unknown 28983 1726883112.87278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.87505: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883112.87510: variable 'omit' from source: magic vars 28983 1726883112.87512: starting attempt loop 28983 1726883112.87515: running the handler 28983 1726883112.87517: variable '__network_connections_result' from source: set_fact 28983 1726883112.87589: variable '__network_connections_result' from source: set_fact 28983 1726883112.87743: handler run complete 28983 1726883112.87781: attempt loop complete, returning result 28983 1726883112.87785: _execute() done 28983 1726883112.87787: dumping result to json 28983 
1726883112.87831: done dumping result, returning 28983 1726883112.87836: done running TaskExecutor() for managed_node2/TASK: Show result [0affe814-3a2d-b16d-c0a7-00000000213a] 28983 1726883112.87839: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000213a 28983 1726883112.87917: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000213a 28983 1726883112.87920: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a" ] } } 28983 1726883112.88025: no more pending results, returning what we have 28983 1726883112.88029: results queue empty 28983 1726883112.88030: checking for any_errors_fatal 28983 1726883112.88032: done checking for any_errors_fatal 28983 1726883112.88033: checking for max_fail_percentage 28983 1726883112.88037: done checking for max_fail_percentage 28983 1726883112.88038: checking to see if all hosts have failed and the running result is not ok 28983 1726883112.88039: done checking to see if all hosts have failed 28983 1726883112.88039: getting the remaining hosts for this loop 28983 1726883112.88041: done getting the remaining hosts for this loop 28983 1726883112.88045: getting the next task for host managed_node2 28983 1726883112.88056: done getting next task for host managed_node2 28983 1726883112.88060: ^ task is: TASK: Include network role 28983 1726883112.88065: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883112.88069: getting variables 28983 1726883112.88073: in VariableManager get_vars() 28983 1726883112.88112: Calling all_inventory to load vars for managed_node2 28983 1726883112.88115: Calling groups_inventory to load vars for managed_node2 28983 1726883112.88118: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883112.88127: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.88130: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.88185: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.90342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883112.93256: done with get_vars() 28983 1726883112.93293: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:45:12 -0400 (0:00:00.084) 0:02:22.931 ****** 28983 1726883112.93409: entering _queue_task() for 
managed_node2/include_role 28983 1726883112.93850: worker is 1 (out of 1 available) 28983 1726883112.93864: exiting _queue_task() for managed_node2/include_role 28983 1726883112.93876: done queuing things up, now waiting for results queue to drain 28983 1726883112.93879: waiting for pending results... 28983 1726883112.94195: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883112.94287: in run() - task 0affe814-3a2d-b16d-c0a7-00000000213e 28983 1726883112.94292: variable 'ansible_search_path' from source: unknown 28983 1726883112.94295: variable 'ansible_search_path' from source: unknown 28983 1726883112.94361: calling self._execute() 28983 1726883112.94514: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883112.94518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883112.94521: variable 'omit' from source: magic vars 28983 1726883112.94933: variable 'ansible_distribution_major_version' from source: facts 28983 1726883112.94949: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883112.95040: _execute() done 28983 1726883112.95044: dumping result to json 28983 1726883112.95048: done dumping result, returning 28983 1726883112.95051: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-00000000213e] 28983 1726883112.95053: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000213e 28983 1726883112.95152: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000213e 28983 1726883112.95156: WORKER PROCESS EXITING 28983 1726883112.95189: no more pending results, returning what we have 28983 1726883112.95194: in VariableManager get_vars() 28983 1726883112.95249: Calling all_inventory to load vars for managed_node2 28983 1726883112.95253: Calling groups_inventory to load vars for managed_node2 28983 1726883112.95259: Calling all_plugins_inventory to load vars for managed_node2 28983 
1726883112.95271: Calling all_plugins_play to load vars for managed_node2 28983 1726883112.95275: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883112.95281: Calling groups_plugins_play to load vars for managed_node2 28983 1726883112.97750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.00613: done with get_vars() 28983 1726883113.00648: variable 'ansible_search_path' from source: unknown 28983 1726883113.00649: variable 'ansible_search_path' from source: unknown 28983 1726883113.00830: variable 'omit' from source: magic vars 28983 1726883113.00888: variable 'omit' from source: magic vars 28983 1726883113.00909: variable 'omit' from source: magic vars 28983 1726883113.00913: we have included files to process 28983 1726883113.00914: generating all_blocks data 28983 1726883113.00917: done generating all_blocks data 28983 1726883113.00922: processing included file: fedora.linux_system_roles.network 28983 1726883113.00953: in VariableManager get_vars() 28983 1726883113.00971: done with get_vars() 28983 1726883113.01005: in VariableManager get_vars() 28983 1726883113.01028: done with get_vars() 28983 1726883113.01075: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883113.01252: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883113.01367: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883113.01997: in VariableManager get_vars() 28983 1726883113.02024: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883113.04554: iterating over new_blocks loaded from include file 28983 1726883113.04556: in VariableManager get_vars() 28983 1726883113.04578: done with get_vars() 28983 1726883113.04580: 
filtering new block on tags 28983 1726883113.04988: done filtering new block on tags 28983 1726883113.04992: in VariableManager get_vars() 28983 1726883113.05012: done with get_vars() 28983 1726883113.05014: filtering new block on tags 28983 1726883113.05039: done filtering new block on tags 28983 1726883113.05041: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883113.05048: extending task lists for all hosts with included blocks 28983 1726883113.05196: done extending task lists 28983 1726883113.05198: done processing included files 28983 1726883113.05199: results queue empty 28983 1726883113.05200: checking for any_errors_fatal 28983 1726883113.05205: done checking for any_errors_fatal 28983 1726883113.05207: checking for max_fail_percentage 28983 1726883113.05208: done checking for max_fail_percentage 28983 1726883113.05209: checking to see if all hosts have failed and the running result is not ok 28983 1726883113.05210: done checking to see if all hosts have failed 28983 1726883113.05211: getting the remaining hosts for this loop 28983 1726883113.05213: done getting the remaining hosts for this loop 28983 1726883113.05216: getting the next task for host managed_node2 28983 1726883113.05221: done getting next task for host managed_node2 28983 1726883113.05225: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883113.05229: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883113.05243: getting variables 28983 1726883113.05244: in VariableManager get_vars() 28983 1726883113.05262: Calling all_inventory to load vars for managed_node2 28983 1726883113.05265: Calling groups_inventory to load vars for managed_node2 28983 1726883113.05268: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883113.05273: Calling all_plugins_play to load vars for managed_node2 28983 1726883113.05277: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883113.05282: Calling groups_plugins_play to load vars for managed_node2 28983 1726883113.12444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.14615: done with get_vars() 28983 1726883113.14642: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:45:13 -0400 (0:00:00.213) 0:02:23.144 ****** 28983 1726883113.14710: entering _queue_task() for managed_node2/include_tasks 28983 1726883113.14998: worker is 1 (out of 1 available) 28983 
1726883113.15013: exiting _queue_task() for managed_node2/include_tasks 28983 1726883113.15028: done queuing things up, now waiting for results queue to drain 28983 1726883113.15031: waiting for pending results... 28983 1726883113.15247: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883113.15383: in run() - task 0affe814-3a2d-b16d-c0a7-000000002328 28983 1726883113.15397: variable 'ansible_search_path' from source: unknown 28983 1726883113.15402: variable 'ansible_search_path' from source: unknown 28983 1726883113.15436: calling self._execute() 28983 1726883113.15531: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.15538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883113.15548: variable 'omit' from source: magic vars 28983 1726883113.15895: variable 'ansible_distribution_major_version' from source: facts 28983 1726883113.15906: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883113.15919: _execute() done 28983 1726883113.15923: dumping result to json 28983 1726883113.15927: done dumping result, returning 28983 1726883113.15931: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-000000002328] 28983 1726883113.15941: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002328 28983 1726883113.16040: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002328 28983 1726883113.16051: WORKER PROCESS EXITING 28983 1726883113.16114: no more pending results, returning what we have 28983 1726883113.16119: in VariableManager get_vars() 28983 1726883113.16197: Calling all_inventory to load vars for managed_node2 28983 1726883113.16201: Calling groups_inventory to load vars for managed_node2 28983 1726883113.16205: Calling all_plugins_inventory to load vars for managed_node2 
28983 1726883113.16216: Calling all_plugins_play to load vars for managed_node2 28983 1726883113.16221: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883113.16225: Calling groups_plugins_play to load vars for managed_node2 28983 1726883113.18194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.20168: done with get_vars() 28983 1726883113.20202: variable 'ansible_search_path' from source: unknown 28983 1726883113.20204: variable 'ansible_search_path' from source: unknown 28983 1726883113.20258: we have included files to process 28983 1726883113.20260: generating all_blocks data 28983 1726883113.20262: done generating all_blocks data 28983 1726883113.20268: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883113.20269: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883113.20274: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883113.21004: done processing included file 28983 1726883113.21006: iterating over new_blocks loaded from include file 28983 1726883113.21008: in VariableManager get_vars() 28983 1726883113.21031: done with get_vars() 28983 1726883113.21032: filtering new block on tags 28983 1726883113.21059: done filtering new block on tags 28983 1726883113.21061: in VariableManager get_vars() 28983 1726883113.21092: done with get_vars() 28983 1726883113.21094: filtering new block on tags 28983 1726883113.21139: done filtering new block on tags 28983 1726883113.21143: in VariableManager get_vars() 28983 1726883113.21172: done with get_vars() 28983 1726883113.21174: filtering new block on tags 28983 1726883113.21223: done filtering new block on tags 28983 1726883113.21226: done iterating over new_blocks 
loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883113.21232: extending task lists for all hosts with included blocks 28983 1726883113.23104: done extending task lists 28983 1726883113.23105: done processing included files 28983 1726883113.23106: results queue empty 28983 1726883113.23106: checking for any_errors_fatal 28983 1726883113.23109: done checking for any_errors_fatal 28983 1726883113.23110: checking for max_fail_percentage 28983 1726883113.23111: done checking for max_fail_percentage 28983 1726883113.23112: checking to see if all hosts have failed and the running result is not ok 28983 1726883113.23112: done checking to see if all hosts have failed 28983 1726883113.23113: getting the remaining hosts for this loop 28983 1726883113.23114: done getting the remaining hosts for this loop 28983 1726883113.23116: getting the next task for host managed_node2 28983 1726883113.23120: done getting next task for host managed_node2 28983 1726883113.23122: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883113.23126: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883113.23136: getting variables 28983 1726883113.23136: in VariableManager get_vars() 28983 1726883113.23148: Calling all_inventory to load vars for managed_node2 28983 1726883113.23149: Calling groups_inventory to load vars for managed_node2 28983 1726883113.23151: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883113.23155: Calling all_plugins_play to load vars for managed_node2 28983 1726883113.23157: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883113.23159: Calling groups_plugins_play to load vars for managed_node2 28983 1726883113.24333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.26295: done with get_vars() 28983 1726883113.26319: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:45:13 -0400 (0:00:00.116) 0:02:23.261 ****** 28983 1726883113.26381: entering _queue_task() for managed_node2/setup 28983 1726883113.26646: worker is 1 (out of 1 available) 28983 1726883113.26661: exiting _queue_task() for managed_node2/setup 28983 
1726883113.26676: done queuing things up, now waiting for results queue to drain 28983 1726883113.26679: waiting for pending results... 28983 1726883113.26885: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883113.27010: in run() - task 0affe814-3a2d-b16d-c0a7-00000000237f 28983 1726883113.27026: variable 'ansible_search_path' from source: unknown 28983 1726883113.27029: variable 'ansible_search_path' from source: unknown 28983 1726883113.27062: calling self._execute() 28983 1726883113.27151: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.27157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883113.27168: variable 'omit' from source: magic vars 28983 1726883113.27506: variable 'ansible_distribution_major_version' from source: facts 28983 1726883113.27516: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883113.27711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883113.29603: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883113.29665: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883113.29701: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883113.29731: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883113.29759: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883113.29826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883113.29857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883113.29880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883113.29912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883113.29925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883113.29977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883113.29998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883113.30019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883113.30051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883113.30066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883113.30202: variable '__network_required_facts' from source: role '' defaults 28983 1726883113.30211: variable 'ansible_facts' from source: unknown 28983 1726883113.30892: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883113.30897: when evaluation is False, skipping this task 28983 1726883113.30900: _execute() done 28983 1726883113.30903: dumping result to json 28983 1726883113.30908: done dumping result, returning 28983 1726883113.30915: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-00000000237f] 28983 1726883113.30921: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000237f 28983 1726883113.31015: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000237f 28983 1726883113.31019: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883113.31069: no more pending results, returning what we have 28983 1726883113.31073: results queue empty 28983 1726883113.31074: checking for any_errors_fatal 28983 1726883113.31076: done checking for any_errors_fatal 28983 1726883113.31077: checking for max_fail_percentage 28983 1726883113.31079: done checking for max_fail_percentage 28983 1726883113.31080: checking to see if all hosts have failed and the running result is not ok 28983 1726883113.31081: done checking to see if all hosts have failed 28983 1726883113.31082: getting the remaining hosts for this loop 28983 1726883113.31085: done getting the remaining hosts for this loop 28983 1726883113.31089: getting the next task for host managed_node2 28983 1726883113.31102: done getting next task for host managed_node2 
28983 1726883113.31106: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883113.31113: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883113.31142: getting variables 28983 1726883113.31144: in VariableManager get_vars() 28983 1726883113.31189: Calling all_inventory to load vars for managed_node2 28983 1726883113.31193: Calling groups_inventory to load vars for managed_node2 28983 1726883113.31195: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883113.31204: Calling all_plugins_play to load vars for managed_node2 28983 1726883113.31208: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883113.31217: Calling groups_plugins_play to load vars for managed_node2 28983 1726883113.32621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.34253: done with get_vars() 28983 1726883113.34284: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:45:13 -0400 (0:00:00.079) 0:02:23.341 ****** 28983 1726883113.34374: entering _queue_task() for managed_node2/stat 28983 1726883113.34661: worker is 1 (out of 1 available) 28983 1726883113.34682: exiting _queue_task() for managed_node2/stat 28983 1726883113.34696: done queuing things up, now waiting for results queue to drain 28983 1726883113.34698: waiting for pending results... 
28983 1726883113.34910: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883113.35057: in run() - task 0affe814-3a2d-b16d-c0a7-000000002381 28983 1726883113.35073: variable 'ansible_search_path' from source: unknown 28983 1726883113.35077: variable 'ansible_search_path' from source: unknown 28983 1726883113.35108: calling self._execute() 28983 1726883113.35198: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.35204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883113.35215: variable 'omit' from source: magic vars 28983 1726883113.35555: variable 'ansible_distribution_major_version' from source: facts 28983 1726883113.35566: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883113.35712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883113.35950: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883113.35989: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883113.36019: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883113.36054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883113.36183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883113.36204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883113.36225: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883113.36253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883113.36331: variable '__network_is_ostree' from source: set_fact 28983 1726883113.36344: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883113.36349: when evaluation is False, skipping this task 28983 1726883113.36352: _execute() done 28983 1726883113.36354: dumping result to json 28983 1726883113.36357: done dumping result, returning 28983 1726883113.36364: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-000000002381] 28983 1726883113.36369: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002381 28983 1726883113.36465: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002381 28983 1726883113.36470: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883113.36530: no more pending results, returning what we have 28983 1726883113.36536: results queue empty 28983 1726883113.36538: checking for any_errors_fatal 28983 1726883113.36547: done checking for any_errors_fatal 28983 1726883113.36548: checking for max_fail_percentage 28983 1726883113.36550: done checking for max_fail_percentage 28983 1726883113.36551: checking to see if all hosts have failed and the running result is not ok 28983 1726883113.36552: done checking to see if all hosts have failed 28983 1726883113.36553: getting the remaining hosts for this loop 28983 1726883113.36556: done getting the remaining hosts for this loop 28983 
1726883113.36560: getting the next task for host managed_node2 28983 1726883113.36570: done getting next task for host managed_node2 28983 1726883113.36576: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883113.36584: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883113.36608: getting variables 28983 1726883113.36609: in VariableManager get_vars() 28983 1726883113.36659: Calling all_inventory to load vars for managed_node2 28983 1726883113.36662: Calling groups_inventory to load vars for managed_node2 28983 1726883113.36665: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883113.36676: Calling all_plugins_play to load vars for managed_node2 28983 1726883113.36679: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883113.36683: Calling groups_plugins_play to load vars for managed_node2 28983 1726883113.37945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.39672: done with get_vars() 28983 1726883113.39698: done getting variables 28983 1726883113.39747: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:45:13 -0400 (0:00:00.054) 0:02:23.395 ****** 28983 1726883113.39780: entering _queue_task() for managed_node2/set_fact 28983 1726883113.40029: worker is 1 (out of 1 available) 28983 1726883113.40045: exiting _queue_task() for managed_node2/set_fact 28983 1726883113.40059: done queuing things up, now waiting for results queue to drain 28983 1726883113.40061: waiting for pending results... 
28983 1726883113.40254: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883113.40388: in run() - task 0affe814-3a2d-b16d-c0a7-000000002382 28983 1726883113.40406: variable 'ansible_search_path' from source: unknown 28983 1726883113.40409: variable 'ansible_search_path' from source: unknown 28983 1726883113.40441: calling self._execute() 28983 1726883113.40530: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.40538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883113.40549: variable 'omit' from source: magic vars 28983 1726883113.40874: variable 'ansible_distribution_major_version' from source: facts 28983 1726883113.40884: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883113.41026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883113.41257: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883113.41299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883113.41327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883113.41358: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883113.41462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883113.41486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883113.41511: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883113.41532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883113.41612: variable '__network_is_ostree' from source: set_fact 28983 1726883113.41619: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883113.41623: when evaluation is False, skipping this task 28983 1726883113.41625: _execute() done 28983 1726883113.41630: dumping result to json 28983 1726883113.41636: done dumping result, returning 28983 1726883113.41645: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-000000002382] 28983 1726883113.41650: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002382 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883113.41802: no more pending results, returning what we have 28983 1726883113.41806: results queue empty 28983 1726883113.41807: checking for any_errors_fatal 28983 1726883113.41812: done checking for any_errors_fatal 28983 1726883113.41813: checking for max_fail_percentage 28983 1726883113.41815: done checking for max_fail_percentage 28983 1726883113.41817: checking to see if all hosts have failed and the running result is not ok 28983 1726883113.41818: done checking to see if all hosts have failed 28983 1726883113.41819: getting the remaining hosts for this loop 28983 1726883113.41821: done getting the remaining hosts for this loop 28983 1726883113.41825: getting the next task for host managed_node2 28983 1726883113.41838: done getting next task for host managed_node2 28983 
1726883113.41843: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883113.41851: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883113.41875: getting variables 28983 1726883113.41877: in VariableManager get_vars() 28983 1726883113.41916: Calling all_inventory to load vars for managed_node2 28983 1726883113.41920: Calling groups_inventory to load vars for managed_node2 28983 1726883113.41923: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883113.41931: Calling all_plugins_play to load vars for managed_node2 28983 1726883113.41942: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883113.41948: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002382 28983 1726883113.41952: Calling groups_plugins_play to load vars for managed_node2 28983 1726883113.42472: WORKER PROCESS EXITING 28983 1726883113.43178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883113.45936: done with get_vars() 28983 1726883113.45974: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:45:13 -0400 (0:00:00.063) 0:02:23.458 ****** 28983 1726883113.46098: entering _queue_task() for managed_node2/service_facts 28983 1726883113.46389: worker is 1 (out of 1 available) 28983 1726883113.46403: exiting _queue_task() for managed_node2/service_facts 28983 1726883113.46418: done queuing things up, now waiting for results queue to drain 28983 1726883113.46420: waiting for pending results... 
28983 1726883113.46859: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883113.47000: in run() - task 0affe814-3a2d-b16d-c0a7-000000002384 28983 1726883113.47028: variable 'ansible_search_path' from source: unknown 28983 1726883113.47042: variable 'ansible_search_path' from source: unknown 28983 1726883113.47095: calling self._execute() 28983 1726883113.47229: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.47248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883113.47269: variable 'omit' from source: magic vars 28983 1726883113.47776: variable 'ansible_distribution_major_version' from source: facts 28983 1726883113.47941: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883113.47945: variable 'omit' from source: magic vars 28983 1726883113.47948: variable 'omit' from source: magic vars 28983 1726883113.47982: variable 'omit' from source: magic vars 28983 1726883113.48038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883113.48095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883113.48125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883113.48156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883113.48183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883113.48229: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883113.48233: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.48239: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883113.48325: Set connection var ansible_connection to ssh 28983 1726883113.48338: Set connection var ansible_shell_executable to /bin/sh 28983 1726883113.48347: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883113.48356: Set connection var ansible_timeout to 10 28983 1726883113.48362: Set connection var ansible_pipelining to False 28983 1726883113.48365: Set connection var ansible_shell_type to sh 28983 1726883113.48390: variable 'ansible_shell_executable' from source: unknown 28983 1726883113.48393: variable 'ansible_connection' from source: unknown 28983 1726883113.48396: variable 'ansible_module_compression' from source: unknown 28983 1726883113.48401: variable 'ansible_shell_type' from source: unknown 28983 1726883113.48403: variable 'ansible_shell_executable' from source: unknown 28983 1726883113.48407: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883113.48412: variable 'ansible_pipelining' from source: unknown 28983 1726883113.48415: variable 'ansible_timeout' from source: unknown 28983 1726883113.48423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883113.48592: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883113.48607: variable 'omit' from source: magic vars 28983 1726883113.48611: starting attempt loop 28983 1726883113.48613: running the handler 28983 1726883113.48626: _low_level_execute_command(): starting 28983 1726883113.48636: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883113.49150: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883113.49154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.49157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883113.49160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.49216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883113.49224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883113.49226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883113.49303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883113.51085: stdout chunk (state=3): >>>/root <<< 28983 1726883113.51192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883113.51242: stderr chunk (state=3): >>><<< 28983 1726883113.51245: stdout chunk (state=3): >>><<< 28983 1726883113.51264: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883113.51276: _low_level_execute_command(): starting 28983 1726883113.51283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150 `" && echo ansible-tmp-1726883113.5126479-34221-80681665719150="` echo /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150 `" ) && sleep 0' 28983 1726883113.51702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883113.51740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883113.51743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883113.51746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883113.51758: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883113.51761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.51810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883113.51814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883113.51892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883113.53894: stdout chunk (state=3): >>>ansible-tmp-1726883113.5126479-34221-80681665719150=/root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150 <<< 28983 1726883113.54010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883113.54060: stderr chunk (state=3): >>><<< 28983 1726883113.54064: stdout chunk (state=3): >>><<< 28983 1726883113.54081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883113.5126479-34221-80681665719150=/root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883113.54117: variable 'ansible_module_compression' from source: unknown 28983 1726883113.54158: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883113.54193: variable 'ansible_facts' from source: unknown 28983 1726883113.54256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py 28983 1726883113.54366: Sending initial data 28983 1726883113.54370: Sent initial data (161 bytes) 28983 1726883113.54794: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883113.54838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883113.54842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.54844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883113.54847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.54901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883113.54904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883113.54906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883113.54972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883113.56605: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883113.56614: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883113.56677: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883113.56744: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpyi27kowg /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py <<< 28983 1726883113.56753: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py" <<< 28983 1726883113.56813: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpyi27kowg" to remote "/root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py" <<< 28983 1726883113.57749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883113.57811: stderr chunk (state=3): >>><<< 28983 1726883113.57814: stdout chunk (state=3): >>><<< 28983 1726883113.57830: done transferring module to remote 28983 1726883113.57841: _low_level_execute_command(): starting 28983 1726883113.57847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/ /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py && sleep 0' 28983 1726883113.58302: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883113.58305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883113.58308: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.58312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883113.58317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.58367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883113.58371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883113.58449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883113.60309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883113.60354: stderr chunk (state=3): >>><<< 28983 1726883113.60358: stdout chunk (state=3): >>><<< 28983 1726883113.60378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883113.60381: _low_level_execute_command(): starting 28983 1726883113.60387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/AnsiballZ_service_facts.py && sleep 0' 28983 1726883113.60791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883113.60830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883113.60836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.60839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883113.60841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883113.60894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883113.60897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883113.60967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883115.58173: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state":<<< 28983 1726883115.58186: stdout chunk (state=3): >>> "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "sourc<<< 28983 1726883115.58227: stdout chunk (state=3): >>>e": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": 
"active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "stat<<< 28983 1726883115.58241: stdout chunk (state=3): >>>ic", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", 
"source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "stati<<< 28983 1726883115.58265: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", 
"source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883115.59838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883115.59899: stderr chunk (state=3): >>><<< 28983 1726883115.59903: stdout chunk (state=3): >>><<< 28983 1726883115.59941: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883115.60985: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883115.60995: _low_level_execute_command(): starting 28983 1726883115.61004: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883113.5126479-34221-80681665719150/ > /dev/null 2>&1 && sleep 0' 28983 1726883115.61491: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883115.61495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883115.61497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass <<< 28983 1726883115.61500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883115.61502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.61555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883115.61562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883115.61630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883115.63619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883115.63671: stderr chunk (state=3): >>><<< 28983 1726883115.63676: stdout chunk (state=3): >>><<< 28983 1726883115.63692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883115.63699: handler run complete 28983 1726883115.63874: variable 'ansible_facts' from source: unknown 28983 1726883115.64005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883115.64452: variable 'ansible_facts' from source: unknown 28983 1726883115.64582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883115.64778: attempt loop complete, returning result 28983 1726883115.64785: _execute() done 28983 1726883115.64788: dumping result to json 28983 1726883115.64842: done dumping result, returning 28983 1726883115.64849: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000002384] 28983 1726883115.64855: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002384 28983 1726883115.65853: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002384 28983 1726883115.65856: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883115.65917: no more pending results, returning what we have 28983 1726883115.65919: results queue empty 28983 1726883115.65920: checking for any_errors_fatal 28983 1726883115.65923: done checking for any_errors_fatal 28983 1726883115.65924: checking for max_fail_percentage 28983 1726883115.65925: done checking for max_fail_percentage 28983 1726883115.65926: 
checking to see if all hosts have failed and the running result is not ok 28983 1726883115.65926: done checking to see if all hosts have failed 28983 1726883115.65927: getting the remaining hosts for this loop 28983 1726883115.65928: done getting the remaining hosts for this loop 28983 1726883115.65931: getting the next task for host managed_node2 28983 1726883115.65938: done getting next task for host managed_node2 28983 1726883115.65941: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883115.65947: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883115.65955: getting variables 28983 1726883115.65956: in VariableManager get_vars() 28983 1726883115.65984: Calling all_inventory to load vars for managed_node2 28983 1726883115.65987: Calling groups_inventory to load vars for managed_node2 28983 1726883115.65988: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883115.65996: Calling all_plugins_play to load vars for managed_node2 28983 1726883115.65998: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883115.66000: Calling groups_plugins_play to load vars for managed_node2 28983 1726883115.67146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883115.68744: done with get_vars() 28983 1726883115.68767: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:45:15 -0400 (0:00:02.227) 0:02:25.686 ****** 28983 1726883115.68852: entering _queue_task() for managed_node2/package_facts 28983 1726883115.69104: worker is 1 (out of 1 available) 28983 1726883115.69118: exiting _queue_task() for managed_node2/package_facts 28983 1726883115.69133: done queuing things up, now waiting for results queue to drain 28983 1726883115.69137: waiting for pending results... 
28983 1726883115.69349: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883115.69486: in run() - task 0affe814-3a2d-b16d-c0a7-000000002385 28983 1726883115.69501: variable 'ansible_search_path' from source: unknown 28983 1726883115.69505: variable 'ansible_search_path' from source: unknown 28983 1726883115.69536: calling self._execute() 28983 1726883115.69626: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883115.69632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883115.69646: variable 'omit' from source: magic vars 28983 1726883115.69986: variable 'ansible_distribution_major_version' from source: facts 28983 1726883115.69997: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883115.70004: variable 'omit' from source: magic vars 28983 1726883115.70070: variable 'omit' from source: magic vars 28983 1726883115.70100: variable 'omit' from source: magic vars 28983 1726883115.70140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883115.70172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883115.70193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883115.70209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883115.70220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883115.70255: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883115.70258: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883115.70263: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883115.70349: Set connection var ansible_connection to ssh 28983 1726883115.70360: Set connection var ansible_shell_executable to /bin/sh 28983 1726883115.70369: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883115.70380: Set connection var ansible_timeout to 10 28983 1726883115.70387: Set connection var ansible_pipelining to False 28983 1726883115.70389: Set connection var ansible_shell_type to sh 28983 1726883115.70410: variable 'ansible_shell_executable' from source: unknown 28983 1726883115.70413: variable 'ansible_connection' from source: unknown 28983 1726883115.70416: variable 'ansible_module_compression' from source: unknown 28983 1726883115.70420: variable 'ansible_shell_type' from source: unknown 28983 1726883115.70423: variable 'ansible_shell_executable' from source: unknown 28983 1726883115.70428: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883115.70432: variable 'ansible_pipelining' from source: unknown 28983 1726883115.70437: variable 'ansible_timeout' from source: unknown 28983 1726883115.70443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883115.70615: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883115.70627: variable 'omit' from source: magic vars 28983 1726883115.70633: starting attempt loop 28983 1726883115.70638: running the handler 28983 1726883115.70652: _low_level_execute_command(): starting 28983 1726883115.70660: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883115.71210: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883115.71214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883115.71218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883115.71222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.71287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883115.71289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883115.71291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883115.71362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883115.73123: stdout chunk (state=3): >>>/root <<< 28983 1726883115.73238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883115.73289: stderr chunk (state=3): >>><<< 28983 1726883115.73293: stdout chunk (state=3): >>><<< 28983 1726883115.73313: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883115.73329: _low_level_execute_command(): starting 28983 1726883115.73335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017 `" && echo ansible-tmp-1726883115.7331264-34254-268393817133017="` echo /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017 `" ) && sleep 0' 28983 1726883115.73781: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883115.73784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.73787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883115.73796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.73849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883115.73856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883115.73931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883115.75911: stdout chunk (state=3): >>>ansible-tmp-1726883115.7331264-34254-268393817133017=/root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017 <<< 28983 1726883115.76025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883115.76073: stderr chunk (state=3): >>><<< 28983 1726883115.76080: stdout chunk (state=3): >>><<< 28983 1726883115.76096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883115.7331264-34254-268393817133017=/root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883115.76130: variable 'ansible_module_compression' from source: unknown 28983 1726883115.76174: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883115.76227: variable 'ansible_facts' from source: unknown 28983 1726883115.76360: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py 28983 1726883115.76472: Sending initial data 28983 1726883115.76476: Sent initial data (162 bytes) 28983 1726883115.76932: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883115.76935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883115.76938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883115.76941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883115.76944: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.76995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883115.77000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883115.77075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883115.78713: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883115.78721: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883115.78783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883115.78854: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxfpgz26f /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py <<< 28983 1726883115.78861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py" <<< 28983 1726883115.78918: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpxfpgz26f" to remote "/root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py" <<< 28983 1726883115.80749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883115.80805: stderr chunk (state=3): >>><<< 28983 1726883115.80809: stdout chunk (state=3): >>><<< 28983 1726883115.80826: done transferring module to remote 28983 1726883115.80837: _low_level_execute_command(): starting 28983 1726883115.80842: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/ /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py && sleep 0' 28983 1726883115.81300: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883115.81304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883115.81306: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.81308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883115.81311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.81365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883115.81372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883115.81444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883115.83328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883115.83376: stderr chunk (state=3): >>><<< 28983 1726883115.83379: stdout chunk (state=3): >>><<< 28983 1726883115.83392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883115.83395: _low_level_execute_command(): starting 28983 1726883115.83401: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/AnsiballZ_package_facts.py && sleep 0' 28983 1726883115.83824: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883115.83827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.83829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883115.83832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883115.83889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883115.83894: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883115.83968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883116.47748: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": 
"setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": 
"20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", 
"version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": 
"2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 28983 1726883116.47775: stdout chunk (state=3): >>>"readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch":
null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 28983 1726883116.47786: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", <<< 28983 1726883116.47796: stdout chunk (state=3): >>>"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", 
"version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 28983 1726883116.47843: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": 
"1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", <<< 28983 1726883116.47859: stdout chunk (state=3): >>>"version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release":
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", <<< 28983 1726883116.47892: stdout chunk (state=3): >>>"epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release":
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726883116.47912: stdout chunk (state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", 
"release": "502.fc39", "epoch": 0, "arch": "x86_64", "sou<<< 28983 1726883116.47936: stdout chunk (state=3): >>>rce": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": 
"500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": 
"3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 28983 1726883116.47957: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": 
"4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 28983 1726883116.47986: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": 
"libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883116.49881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883116.49947: stderr chunk (state=3): >>><<< 28983 1726883116.49964: stdout chunk (state=3): >>><<< 28983 1726883116.50075: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", 
"release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", 
"release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": 
[{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", 
"release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", 
"release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": 
"crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": 
"libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", 
"version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": 
"11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": 
"501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", 
"version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": 
"2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": 
[{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883116.52807: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883116.52828: _low_level_execute_command(): starting 28983 1726883116.52832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883115.7331264-34254-268393817133017/ > /dev/null 2>&1 && sleep 0' 28983 1726883116.53470: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883116.53509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883116.53513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883116.53515: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883116.53518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883116.53572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883116.53587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883116.53610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883116.53715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883116.55777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883116.55790: stdout chunk (state=3): >>><<< 28983 1726883116.55805: stderr chunk (state=3): >>><<< 28983 1726883116.55826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883116.55843: handler run complete 28983 1726883116.57319: variable 'ansible_facts' from source: unknown 28983 1726883116.57995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.60014: variable 'ansible_facts' from source: unknown 28983 1726883116.60454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.61219: attempt loop complete, returning result 28983 1726883116.61237: _execute() done 28983 1726883116.61240: dumping result to json 28983 1726883116.61426: done dumping result, returning 28983 1726883116.61435: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000002385] 28983 1726883116.61441: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002385 28983 1726883116.63635: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002385 28983 1726883116.63639: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883116.63743: no more pending results, returning what we have 28983 1726883116.63745: results queue empty 28983 1726883116.63746: checking for any_errors_fatal 28983 1726883116.63750: done checking for any_errors_fatal 28983 1726883116.63750: checking for max_fail_percentage 28983 1726883116.63752: done checking for max_fail_percentage 28983 1726883116.63753: checking to see if all hosts have failed 
and the running result is not ok 28983 1726883116.63753: done checking to see if all hosts have failed 28983 1726883116.63754: getting the remaining hosts for this loop 28983 1726883116.63755: done getting the remaining hosts for this loop 28983 1726883116.63758: getting the next task for host managed_node2 28983 1726883116.63764: done getting next task for host managed_node2 28983 1726883116.63767: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883116.63772: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883116.63781: getting variables 28983 1726883116.63782: in VariableManager get_vars() 28983 1726883116.63812: Calling all_inventory to load vars for managed_node2 28983 1726883116.63815: Calling groups_inventory to load vars for managed_node2 28983 1726883116.63816: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.63824: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.63826: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.63828: Calling groups_plugins_play to load vars for managed_node2 28983 1726883116.64974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.66650: done with get_vars() 28983 1726883116.66674: done getting variables 28983 1726883116.66729: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:45:16 -0400 (0:00:00.979) 0:02:26.665 ****** 28983 1726883116.66760: entering _queue_task() for managed_node2/debug 28983 1726883116.67011: worker is 1 (out of 1 available) 28983 1726883116.67026: exiting _queue_task() for managed_node2/debug 28983 1726883116.67041: done queuing things up, now waiting for results queue to drain 28983 1726883116.67043: waiting for pending results... 
28983 1726883116.67254: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883116.67383: in run() - task 0affe814-3a2d-b16d-c0a7-000000002329 28983 1726883116.67400: variable 'ansible_search_path' from source: unknown 28983 1726883116.67404: variable 'ansible_search_path' from source: unknown 28983 1726883116.67437: calling self._execute() 28983 1726883116.67526: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.67535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.67546: variable 'omit' from source: magic vars 28983 1726883116.67892: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.67902: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883116.67909: variable 'omit' from source: magic vars 28983 1726883116.67964: variable 'omit' from source: magic vars 28983 1726883116.68047: variable 'network_provider' from source: set_fact 28983 1726883116.68064: variable 'omit' from source: magic vars 28983 1726883116.68104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883116.68137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883116.68158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883116.68177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883116.68187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883116.68216: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883116.68220: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883116.68225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.68312: Set connection var ansible_connection to ssh 28983 1726883116.68322: Set connection var ansible_shell_executable to /bin/sh 28983 1726883116.68331: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883116.68341: Set connection var ansible_timeout to 10 28983 1726883116.68347: Set connection var ansible_pipelining to False 28983 1726883116.68350: Set connection var ansible_shell_type to sh 28983 1726883116.68377: variable 'ansible_shell_executable' from source: unknown 28983 1726883116.68381: variable 'ansible_connection' from source: unknown 28983 1726883116.68384: variable 'ansible_module_compression' from source: unknown 28983 1726883116.68386: variable 'ansible_shell_type' from source: unknown 28983 1726883116.68390: variable 'ansible_shell_executable' from source: unknown 28983 1726883116.68394: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.68399: variable 'ansible_pipelining' from source: unknown 28983 1726883116.68402: variable 'ansible_timeout' from source: unknown 28983 1726883116.68407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.68528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883116.68539: variable 'omit' from source: magic vars 28983 1726883116.68545: starting attempt loop 28983 1726883116.68549: running the handler 28983 1726883116.68595: handler run complete 28983 1726883116.68609: attempt loop complete, returning result 28983 1726883116.68612: _execute() done 28983 1726883116.68615: dumping result to json 28983 1726883116.68620: done dumping result, returning 
28983 1726883116.68628: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000002329] 28983 1726883116.68635: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002329 28983 1726883116.68725: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002329 28983 1726883116.68728: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883116.68810: no more pending results, returning what we have 28983 1726883116.68814: results queue empty 28983 1726883116.68815: checking for any_errors_fatal 28983 1726883116.68822: done checking for any_errors_fatal 28983 1726883116.68823: checking for max_fail_percentage 28983 1726883116.68824: done checking for max_fail_percentage 28983 1726883116.68825: checking to see if all hosts have failed and the running result is not ok 28983 1726883116.68826: done checking to see if all hosts have failed 28983 1726883116.68827: getting the remaining hosts for this loop 28983 1726883116.68829: done getting the remaining hosts for this loop 28983 1726883116.68836: getting the next task for host managed_node2 28983 1726883116.68844: done getting next task for host managed_node2 28983 1726883116.68855: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883116.68861: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883116.68875: getting variables 28983 1726883116.68876: in VariableManager get_vars() 28983 1726883116.68917: Calling all_inventory to load vars for managed_node2 28983 1726883116.68921: Calling groups_inventory to load vars for managed_node2 28983 1726883116.68924: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.68932: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.68937: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.68941: Calling groups_plugins_play to load vars for managed_node2 28983 1726883116.70139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.71748: done with get_vars() 28983 1726883116.71771: done getting variables 28983 1726883116.71816: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:45:16 -0400 (0:00:00.050) 0:02:26.716 ****** 28983 1726883116.71850: entering _queue_task() for managed_node2/fail 28983 1726883116.72066: worker is 1 (out of 1 available) 28983 1726883116.72080: exiting _queue_task() for managed_node2/fail 28983 1726883116.72093: done queuing things up, now waiting for results queue to drain 28983 1726883116.72095: waiting for pending results... 28983 1726883116.72291: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883116.72410: in run() - task 0affe814-3a2d-b16d-c0a7-00000000232a 28983 1726883116.72423: variable 'ansible_search_path' from source: unknown 28983 1726883116.72429: variable 'ansible_search_path' from source: unknown 28983 1726883116.72460: calling self._execute() 28983 1726883116.72544: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.72555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.72563: variable 'omit' from source: magic vars 28983 1726883116.72892: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.72902: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883116.73006: variable 'network_state' from source: role '' defaults 28983 1726883116.73017: Evaluated conditional (network_state != {}): False 28983 1726883116.73021: when evaluation is False, skipping this task 28983 1726883116.73024: _execute() done 28983 1726883116.73026: dumping result to json 28983 1726883116.73031: done dumping result, returning 28983 1726883116.73040: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-00000000232a] 28983 1726883116.73046: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232a 28983 1726883116.73144: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232a 28983 1726883116.73147: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883116.73203: no more pending results, returning what we have 28983 1726883116.73207: results queue empty 28983 1726883116.73208: checking for any_errors_fatal 28983 1726883116.73214: done checking for any_errors_fatal 28983 1726883116.73215: checking for max_fail_percentage 28983 1726883116.73217: done checking for max_fail_percentage 28983 1726883116.73218: checking to see if all hosts have failed and the running result is not ok 28983 1726883116.73219: done checking to see if all hosts have failed 28983 1726883116.73220: getting the remaining hosts for this loop 28983 1726883116.73222: done getting the remaining hosts for this loop 28983 1726883116.73226: getting the next task for host managed_node2 28983 1726883116.73236: done getting next task for host managed_node2 28983 1726883116.73241: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883116.73247: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883116.73277: getting variables 28983 1726883116.73278: in VariableManager get_vars() 28983 1726883116.73316: Calling all_inventory to load vars for managed_node2 28983 1726883116.73319: Calling groups_inventory to load vars for managed_node2 28983 1726883116.73320: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.73327: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.73329: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.73331: Calling groups_plugins_play to load vars for managed_node2 28983 1726883116.74666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.76237: done with get_vars() 28983 1726883116.76260: done getting variables 28983 1726883116.76307: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:45:16 -0400 (0:00:00.044) 0:02:26.761 ****** 28983 1726883116.76336: entering _queue_task() for managed_node2/fail 28983 1726883116.76542: worker is 1 (out of 1 available) 28983 1726883116.76557: exiting _queue_task() for managed_node2/fail 28983 1726883116.76571: done queuing things up, now waiting for results queue to drain 28983 1726883116.76573: waiting for pending results... 28983 1726883116.76766: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883116.76890: in run() - task 0affe814-3a2d-b16d-c0a7-00000000232b 28983 1726883116.76904: variable 'ansible_search_path' from source: unknown 28983 1726883116.76908: variable 'ansible_search_path' from source: unknown 28983 1726883116.76940: calling self._execute() 28983 1726883116.77021: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.77028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.77040: variable 'omit' from source: magic vars 28983 1726883116.77353: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.77362: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883116.77466: variable 'network_state' from source: role '' defaults 28983 1726883116.77480: Evaluated conditional (network_state != {}): False 28983 1726883116.77483: when evaluation is False, skipping this task 28983 1726883116.77486: _execute() done 28983 1726883116.77489: dumping result to json 28983 1726883116.77494: done dumping result, returning 28983 1726883116.77502: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-00000000232b] 28983 1726883116.77507: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232b 28983 1726883116.77607: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232b 28983 1726883116.77610: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883116.77663: no more pending results, returning what we have 28983 1726883116.77667: results queue empty 28983 1726883116.77668: checking for any_errors_fatal 28983 1726883116.77674: done checking for any_errors_fatal 28983 1726883116.77675: checking for max_fail_percentage 28983 1726883116.77677: done checking for max_fail_percentage 28983 1726883116.77678: checking to see if all hosts have failed and the running result is not ok 28983 1726883116.77680: done checking to see if all hosts have failed 28983 1726883116.77681: getting the remaining hosts for this loop 28983 1726883116.77683: done getting the remaining hosts for this loop 28983 1726883116.77687: getting the next task for host managed_node2 28983 1726883116.77694: done getting next task for host managed_node2 28983 1726883116.77699: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883116.77705: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883116.77725: getting variables 28983 1726883116.77727: in VariableManager get_vars() 28983 1726883116.77772: Calling all_inventory to load vars for managed_node2 28983 1726883116.77775: Calling groups_inventory to load vars for managed_node2 28983 1726883116.77776: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.77783: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.77785: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.77787: Calling groups_plugins_play to load vars for managed_node2 28983 1726883116.78977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.80685: done with get_vars() 28983 1726883116.80711: done getting variables 28983 1726883116.80757: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:45:16 -0400 (0:00:00.044) 0:02:26.805 ****** 28983 1726883116.80786: entering _queue_task() for managed_node2/fail 28983 1726883116.80998: worker is 1 (out of 1 available) 28983 1726883116.81013: exiting _queue_task() for managed_node2/fail 28983 1726883116.81025: done queuing things up, now waiting for results queue to drain 28983 1726883116.81027: waiting for pending results... 28983 1726883116.81226: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883116.81362: in run() - task 0affe814-3a2d-b16d-c0a7-00000000232c 28983 1726883116.81367: variable 'ansible_search_path' from source: unknown 28983 1726883116.81370: variable 'ansible_search_path' from source: unknown 28983 1726883116.81390: calling self._execute() 28983 1726883116.81472: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.81483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.81494: variable 'omit' from source: magic vars 28983 1726883116.81814: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.81825: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883116.81977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883116.83791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883116.83844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883116.83880: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883116.83912: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883116.83937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883116.84009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883116.84031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883116.84055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.84092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883116.84106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883116.84189: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.84202: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883116.84298: variable 'ansible_distribution' from source: facts 28983 1726883116.84302: variable '__network_rh_distros' from source: role '' defaults 28983 1726883116.84312: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883116.84315: when evaluation is False, skipping this task 28983 
1726883116.84319: _execute() done 28983 1726883116.84326: dumping result to json 28983 1726883116.84330: done dumping result, returning 28983 1726883116.84337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-00000000232c] 28983 1726883116.84344: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232c 28983 1726883116.84434: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232c 28983 1726883116.84437: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883116.84490: no more pending results, returning what we have 28983 1726883116.84494: results queue empty 28983 1726883116.84495: checking for any_errors_fatal 28983 1726883116.84502: done checking for any_errors_fatal 28983 1726883116.84503: checking for max_fail_percentage 28983 1726883116.84505: done checking for max_fail_percentage 28983 1726883116.84506: checking to see if all hosts have failed and the running result is not ok 28983 1726883116.84507: done checking to see if all hosts have failed 28983 1726883116.84508: getting the remaining hosts for this loop 28983 1726883116.84510: done getting the remaining hosts for this loop 28983 1726883116.84515: getting the next task for host managed_node2 28983 1726883116.84524: done getting next task for host managed_node2 28983 1726883116.84529: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883116.84543: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883116.84567: getting variables 28983 1726883116.84569: in VariableManager get_vars() 28983 1726883116.84611: Calling all_inventory to load vars for managed_node2 28983 1726883116.84614: Calling groups_inventory to load vars for managed_node2 28983 1726883116.84617: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.84625: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.84628: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.84632: Calling groups_plugins_play to load vars for managed_node2 28983 1726883116.85874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.87470: done with get_vars() 28983 1726883116.87495: done getting variables 28983 1726883116.87542: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:45:16 -0400 (0:00:00.067) 0:02:26.873 ****** 28983 1726883116.87570: entering _queue_task() for managed_node2/dnf 28983 1726883116.87794: worker is 1 (out of 1 available) 28983 1726883116.87811: exiting _queue_task() for managed_node2/dnf 28983 1726883116.87824: done queuing things up, now waiting for results queue to drain 28983 1726883116.87825: waiting for pending results... 28983 1726883116.88031: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883116.88157: in run() - task 0affe814-3a2d-b16d-c0a7-00000000232d 28983 1726883116.88177: variable 'ansible_search_path' from source: unknown 28983 1726883116.88181: variable 'ansible_search_path' from source: unknown 28983 1726883116.88208: calling self._execute() 28983 1726883116.88296: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.88302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.88313: variable 'omit' from source: magic vars 28983 1726883116.88640: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.88652: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883116.88824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883116.90930: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883116.90982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883116.91015: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883116.91047: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883116.91069: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883116.91138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883116.91162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883116.91185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.91217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883116.91235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883116.91324: variable 'ansible_distribution' from source: facts 28983 1726883116.91332: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.91348: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883116.91440: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883116.91557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883116.91582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883116.91603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.91633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883116.91650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883116.91692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883116.91710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883116.91731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.91765: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883116.91785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883116.91815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883116.91836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883116.91856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.91890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883116.91909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883116.92039: variable 'network_connections' from source: include params 28983 1726883116.92050: variable 'interface' from source: play vars 28983 1726883116.92104: variable 'interface' from source: play vars 28983 1726883116.92165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883116.92310: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883116.92347: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883116.92376: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883116.92399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883116.92441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883116.92459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883116.92486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.92507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883116.92555: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883116.92751: variable 'network_connections' from source: include params 28983 1726883116.92756: variable 'interface' from source: play vars 28983 1726883116.92811: variable 'interface' from source: play vars 28983 1726883116.92830: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883116.92833: when evaluation is False, skipping this task 28983 1726883116.92837: _execute() done 28983 1726883116.92843: dumping result to json 28983 1726883116.92850: done dumping result, returning 28983 1726883116.92856: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000232d] 28983 1726883116.92862: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232d 28983 1726883116.92958: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232d 28983 1726883116.92961: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883116.93019: no more pending results, returning what we have 28983 1726883116.93023: results queue empty 28983 1726883116.93024: checking for any_errors_fatal 28983 1726883116.93030: done checking for any_errors_fatal 28983 1726883116.93031: checking for max_fail_percentage 28983 1726883116.93033: done checking for max_fail_percentage 28983 1726883116.93035: checking to see if all hosts have failed and the running result is not ok 28983 1726883116.93036: done checking to see if all hosts have failed 28983 1726883116.93037: getting the remaining hosts for this loop 28983 1726883116.93040: done getting the remaining hosts for this loop 28983 1726883116.93044: getting the next task for host managed_node2 28983 1726883116.93053: done getting next task for host managed_node2 28983 1726883116.93058: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883116.93064: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883116.93090: getting variables 28983 1726883116.93092: in VariableManager get_vars() 28983 1726883116.93142: Calling all_inventory to load vars for managed_node2 28983 1726883116.93145: Calling groups_inventory to load vars for managed_node2 28983 1726883116.93148: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.93157: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.93160: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.93163: Calling groups_plugins_play to load vars for managed_node2 28983 1726883116.94567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883116.96148: done with get_vars() 28983 1726883116.96172: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883116.96231: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:45:16 -0400 (0:00:00.086) 0:02:26.960 ****** 28983 1726883116.96259: entering _queue_task() for managed_node2/yum 28983 1726883116.96489: worker is 1 (out of 1 available) 28983 1726883116.96505: exiting _queue_task() for managed_node2/yum 28983 1726883116.96518: done queuing things up, now waiting for results queue to drain 28983 1726883116.96520: waiting for pending results... 28983 1726883116.96707: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883116.96820: in run() - task 0affe814-3a2d-b16d-c0a7-00000000232e 28983 1726883116.96833: variable 'ansible_search_path' from source: unknown 28983 1726883116.96838: variable 'ansible_search_path' from source: unknown 28983 1726883116.96876: calling self._execute() 28983 1726883116.96958: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883116.96965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883116.96982: variable 'omit' from source: magic vars 28983 1726883116.97307: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.97317: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883116.97466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883116.99208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883116.99259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883116.99293: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883116.99323: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883116.99348: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883116.99416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883116.99439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883116.99460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883116.99496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883116.99510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883116.99585: variable 'ansible_distribution_major_version' from source: facts 28983 1726883116.99598: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883116.99602: when evaluation is False, skipping this task 28983 1726883116.99604: _execute() done 28983 1726883116.99609: dumping result to json 28983 1726883116.99617: done dumping result, returning 28983 1726883116.99623: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000232e] 28983 1726883116.99628: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232e 28983 1726883116.99728: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232e 28983 1726883116.99732: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883116.99794: no more pending results, returning what we have 28983 1726883116.99797: results queue empty 28983 1726883116.99798: checking for any_errors_fatal 28983 1726883116.99805: done checking for any_errors_fatal 28983 1726883116.99806: checking for max_fail_percentage 28983 1726883116.99808: done checking for max_fail_percentage 28983 1726883116.99809: checking to see if all hosts have failed and the running result is not ok 28983 1726883116.99810: done checking to see if all hosts have failed 28983 1726883116.99811: getting the remaining hosts for this loop 28983 1726883116.99813: done getting the remaining hosts for this loop 28983 1726883116.99818: getting the next task for host managed_node2 28983 1726883116.99826: done getting next task for host managed_node2 28983 1726883116.99830: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883116.99838: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883116.99861: getting variables 28983 1726883116.99863: in VariableManager get_vars() 28983 1726883116.99905: Calling all_inventory to load vars for managed_node2 28983 1726883116.99908: Calling groups_inventory to load vars for managed_node2 28983 1726883116.99911: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883116.99920: Calling all_plugins_play to load vars for managed_node2 28983 1726883116.99923: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883116.99926: Calling groups_plugins_play to load vars for managed_node2 28983 1726883117.01208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883117.07746: done with get_vars() 28983 1726883117.07774: done getting variables 28983 1726883117.07814: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:45:17 -0400 (0:00:00.115) 0:02:27.076 ****** 28983 1726883117.07840: entering _queue_task() for managed_node2/fail 28983 1726883117.08120: worker is 1 (out of 1 available) 28983 1726883117.08138: exiting _queue_task() for managed_node2/fail 28983 1726883117.08151: done queuing things up, now waiting for results queue to drain 28983 1726883117.08154: waiting for pending results... 
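The entries above show the role skipping its YUM-era package check: the task at `tasks/main.yml:48` is gated on the Jinja2 conditional `ansible_distribution_major_version | int < 8`, which evaluated to False on this host, so the executor emitted a skip result instead of running the module. As a rough sketch only (this is not the role's actual code, and the fact value is illustrative), the skip logic can be mirrored in plain Python:

```python
# Hedged sketch of the "skip when conditional is False" pattern seen in
# the log. The fact dict and return shapes are modeled on the log output;
# evaluate_yum_check() is a hypothetical helper, not part of the role.

def evaluate_yum_check(facts):
    """Return a result dict shaped like the skip output in the log."""
    major = int(facts["ansible_distribution_major_version"])
    if not (major < 8):  # mirrors: ansible_distribution_major_version | int < 8
        return {
            "changed": False,
            "false_condition": "ansible_distribution_major_version | int < 8",
            "skip_reason": "Conditional result was False",
        }
    # On EL < 8 hosts the role would actually query YUM here.
    return {"changed": True}

result = evaluate_yum_check({"ansible_distribution_major_version": "39"})
```

On any distribution with a major version of 8 or later, the helper reproduces the `skipping: [managed_node2]` payload shown above.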
28983 1726883117.08349: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883117.08509: in run() - task 0affe814-3a2d-b16d-c0a7-00000000232f 28983 1726883117.08522: variable 'ansible_search_path' from source: unknown 28983 1726883117.08526: variable 'ansible_search_path' from source: unknown 28983 1726883117.08561: calling self._execute() 28983 1726883117.08654: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.08663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.08677: variable 'omit' from source: magic vars 28983 1726883117.09006: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.09017: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883117.09145: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883117.09415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883117.12039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883117.12043: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883117.12047: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883117.12092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883117.12127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883117.12222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883117.12267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.12305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.12363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.12385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.12450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.12487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.12522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.12580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.12601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.12663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.12697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.12730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.12786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.12807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.13038: variable 'network_connections' from source: include params 28983 1726883117.13056: variable 'interface' from source: play vars 28983 1726883117.13140: variable 'interface' from source: play vars 28983 1726883117.13439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883117.13451: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883117.13500: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883117.13544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883117.13583: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883117.13640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883117.13673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883117.13710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.13748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883117.13807: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883117.14138: variable 'network_connections' from source: include params 28983 1726883117.14150: variable 'interface' from source: play vars 28983 1726883117.14225: variable 'interface' from source: play vars 28983 1726883117.14259: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883117.14268: when evaluation is False, skipping this task 28983 1726883117.14275: _execute() done 28983 1726883117.14284: dumping result to json 28983 1726883117.14292: done dumping result, returning 28983 1726883117.14305: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000232f] 28983 1726883117.14314: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232f skipping: [managed_node2] => { "changed": false, 
"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883117.14498: no more pending results, returning what we have 28983 1726883117.14502: results queue empty 28983 1726883117.14503: checking for any_errors_fatal 28983 1726883117.14518: done checking for any_errors_fatal 28983 1726883117.14519: checking for max_fail_percentage 28983 1726883117.14521: done checking for max_fail_percentage 28983 1726883117.14522: checking to see if all hosts have failed and the running result is not ok 28983 1726883117.14523: done checking to see if all hosts have failed 28983 1726883117.14524: getting the remaining hosts for this loop 28983 1726883117.14526: done getting the remaining hosts for this loop 28983 1726883117.14531: getting the next task for host managed_node2 28983 1726883117.14542: done getting next task for host managed_node2 28983 1726883117.14546: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883117.14553: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883117.14574: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000232f 28983 1726883117.14578: WORKER PROCESS EXITING 28983 1726883117.14849: getting variables 28983 1726883117.14851: in VariableManager get_vars() 28983 1726883117.14896: Calling all_inventory to load vars for managed_node2 28983 1726883117.14899: Calling groups_inventory to load vars for managed_node2 28983 1726883117.14902: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883117.14911: Calling all_plugins_play to load vars for managed_node2 28983 1726883117.14915: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883117.14919: Calling groups_plugins_play to load vars for managed_node2 28983 1726883117.17085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883117.19957: done with get_vars() 28983 1726883117.19994: done getting variables 28983 1726883117.20065: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:45:17 -0400 (0:00:00.122) 0:02:27.198 ****** 28983 1726883117.20110: entering _queue_task() for managed_node2/package 28983 1726883117.20569: worker is 1 (out of 1 available) 28983 1726883117.20583: exiting _queue_task() for managed_node2/package 28983 1726883117.20597: done queuing 
things up, now waiting for results queue to drain 28983 1726883117.20599: waiting for pending results... 28983 1726883117.20841: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883117.21142: in run() - task 0affe814-3a2d-b16d-c0a7-000000002330 28983 1726883117.21146: variable 'ansible_search_path' from source: unknown 28983 1726883117.21149: variable 'ansible_search_path' from source: unknown 28983 1726883117.21153: calling self._execute() 28983 1726883117.21230: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.21254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.21277: variable 'omit' from source: magic vars 28983 1726883117.21748: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.21768: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883117.22041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883117.22381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883117.22443: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883117.22493: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883117.22739: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883117.22742: variable 'network_packages' from source: role '' defaults 28983 1726883117.22880: variable '__network_provider_setup' from source: role '' defaults 28983 1726883117.22900: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883117.22988: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883117.23003: variable 
'__network_packages_default_nm' from source: role '' defaults 28983 1726883117.23090: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883117.23363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883117.26256: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883117.26333: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883117.26383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883117.26427: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883117.26470: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883117.26669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.26673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.26677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.26704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.26728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.26796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.26830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.26868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.26928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.26953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.27265: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883117.27433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.27470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.27507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883117.27568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.27592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.27710: variable 'ansible_python' from source: facts 28983 1726883117.27735: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883117.27869: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883117.27950: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883117.28125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.28162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.28239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.28258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.28279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883117.28348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.28390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.28639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.28643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.28645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.28687: variable 'network_connections' from source: include params 28983 1726883117.28698: variable 'interface' from source: play vars 28983 1726883117.28823: variable 'interface' from source: play vars 28983 1726883117.28916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883117.28956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883117.29002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 
1726883117.29052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883117.29117: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883117.29500: variable 'network_connections' from source: include params 28983 1726883117.29511: variable 'interface' from source: play vars 28983 1726883117.29642: variable 'interface' from source: play vars 28983 1726883117.29684: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883117.29795: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883117.30125: variable 'network_connections' from source: include params 28983 1726883117.30129: variable 'interface' from source: play vars 28983 1726883117.30197: variable 'interface' from source: play vars 28983 1726883117.30214: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883117.30282: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883117.30535: variable 'network_connections' from source: include params 28983 1726883117.30539: variable 'interface' from source: play vars 28983 1726883117.30593: variable 'interface' from source: play vars 28983 1726883117.30640: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883117.30690: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883117.30695: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883117.30751: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883117.30932: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883117.31335: variable 'network_connections' from source: include params 28983 
1726883117.31339: variable 'interface' from source: play vars 28983 1726883117.31393: variable 'interface' from source: play vars 28983 1726883117.31401: variable 'ansible_distribution' from source: facts 28983 1726883117.31405: variable '__network_rh_distros' from source: role '' defaults 28983 1726883117.31412: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.31424: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883117.31565: variable 'ansible_distribution' from source: facts 28983 1726883117.31569: variable '__network_rh_distros' from source: role '' defaults 28983 1726883117.31575: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.31582: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883117.31732: variable 'ansible_distribution' from source: facts 28983 1726883117.31745: variable '__network_rh_distros' from source: role '' defaults 28983 1726883117.31747: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.31775: variable 'network_provider' from source: set_fact 28983 1726883117.31787: variable 'ansible_facts' from source: unknown 28983 1726883117.32827: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883117.32831: when evaluation is False, skipping this task 28983 1726883117.32835: _execute() done 28983 1726883117.32863: dumping result to json 28983 1726883117.32866: done dumping result, returning 28983 1726883117.32869: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-000000002330] 28983 1726883117.32874: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002330 28983 1726883117.32966: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002330 28983 1726883117.32969: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883117.33049: no more pending results, returning what we have 28983 1726883117.33054: results queue empty 28983 1726883117.33055: checking for any_errors_fatal 28983 1726883117.33062: done checking for any_errors_fatal 28983 1726883117.33063: checking for max_fail_percentage 28983 1726883117.33065: done checking for max_fail_percentage 28983 1726883117.33066: checking to see if all hosts have failed and the running result is not ok 28983 1726883117.33067: done checking to see if all hosts have failed 28983 1726883117.33068: getting the remaining hosts for this loop 28983 1726883117.33073: done getting the remaining hosts for this loop 28983 1726883117.33078: getting the next task for host managed_node2 28983 1726883117.33087: done getting next task for host managed_node2 28983 1726883117.33092: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883117.33098: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883117.33125: getting variables 28983 1726883117.33126: in VariableManager get_vars() 28983 1726883117.33181: Calling all_inventory to load vars for managed_node2 28983 1726883117.33185: Calling groups_inventory to load vars for managed_node2 28983 1726883117.33188: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883117.33198: Calling all_plugins_play to load vars for managed_node2 28983 1726883117.33201: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883117.33205: Calling groups_plugins_play to load vars for managed_node2 28983 1726883117.34731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883117.37116: done with get_vars() 28983 1726883117.37157: done getting variables 28983 1726883117.37214: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:45:17 -0400 (0:00:00.171) 0:02:27.370 ****** 28983 1726883117.37247: entering _queue_task() for managed_node2/package 28983 1726883117.37529: worker is 1 (out of 1 available) 28983 1726883117.37546: exiting _queue_task() for managed_node2/package 28983 1726883117.37560: done queuing things up, now waiting for results 
queue to drain 28983 1726883117.37562: waiting for pending results... 28983 1726883117.37771: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883117.37912: in run() - task 0affe814-3a2d-b16d-c0a7-000000002331 28983 1726883117.37928: variable 'ansible_search_path' from source: unknown 28983 1726883117.37932: variable 'ansible_search_path' from source: unknown 28983 1726883117.37965: calling self._execute() 28983 1726883117.38059: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.38067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.38081: variable 'omit' from source: magic vars 28983 1726883117.38421: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.38432: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883117.38549: variable 'network_state' from source: role '' defaults 28983 1726883117.38565: Evaluated conditional (network_state != {}): False 28983 1726883117.38569: when evaluation is False, skipping this task 28983 1726883117.38572: _execute() done 28983 1726883117.38575: dumping result to json 28983 1726883117.38577: done dumping result, returning 28983 1726883117.38586: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000002331] 28983 1726883117.38592: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002331 28983 1726883117.38694: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002331 28983 1726883117.38697: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883117.38759: no more pending results, returning what we have 28983 
1726883117.38763: results queue empty 28983 1726883117.38764: checking for any_errors_fatal 28983 1726883117.38770: done checking for any_errors_fatal 28983 1726883117.38771: checking for max_fail_percentage 28983 1726883117.38773: done checking for max_fail_percentage 28983 1726883117.38774: checking to see if all hosts have failed and the running result is not ok 28983 1726883117.38775: done checking to see if all hosts have failed 28983 1726883117.38776: getting the remaining hosts for this loop 28983 1726883117.38779: done getting the remaining hosts for this loop 28983 1726883117.38784: getting the next task for host managed_node2 28983 1726883117.38793: done getting next task for host managed_node2 28983 1726883117.38798: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883117.38805: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883117.38828: getting variables 28983 1726883117.38830: in VariableManager get_vars() 28983 1726883117.38886: Calling all_inventory to load vars for managed_node2 28983 1726883117.38889: Calling groups_inventory to load vars for managed_node2 28983 1726883117.38891: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883117.38900: Calling all_plugins_play to load vars for managed_node2 28983 1726883117.38904: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883117.38907: Calling groups_plugins_play to load vars for managed_node2 28983 1726883117.41069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883117.43445: done with get_vars() 28983 1726883117.43468: done getting variables 28983 1726883117.43517: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:45:17 -0400 (0:00:00.062) 0:02:27.433 ****** 28983 1726883117.43549: entering _queue_task() for managed_node2/package 28983 1726883117.43786: worker is 1 (out of 1 available) 28983 1726883117.43800: exiting _queue_task() for managed_node2/package 28983 1726883117.43813: done queuing things up, now waiting for results queue to drain 28983 1726883117.43816: waiting for pending results... 
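The two skipped tasks traced above ("Install NetworkManager and nmstate when using network_state variable", main.yml:85, and "Install python3-libnmstate when using network_state variable", main.yml:96) both short-circuit on the same conditional, `network_state != {}`, which the log shows evaluating to False. A hypothetical sketch of what such a guarded package task looks like in role YAML (the task name, module, and conditional are taken from the log; the package list is an assumption inferred from the task name, and the real role task may differ):

```yaml
# Hedged sketch only -- not the actual role source.
# The log shows the 'package' action plugin being loaded for this task
# and the conditional "network_state != {}" evaluating to False,
# which produces the "skipping: [managed_node2]" result above.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed from the task name
      - nmstate          # assumed from the task name
    state: present
  when: network_state != {}
```

When `network_state` is left at its role default of `{}`, Ansible records `"false_condition": "network_state != {}"` and skips the task without ever invoking the package manager, which matches the JSON result bodies in the log.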
28983 1726883117.44020: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883117.44158: in run() - task 0affe814-3a2d-b16d-c0a7-000000002332 28983 1726883117.44167: variable 'ansible_search_path' from source: unknown 28983 1726883117.44173: variable 'ansible_search_path' from source: unknown 28983 1726883117.44204: calling self._execute() 28983 1726883117.44293: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.44300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.44311: variable 'omit' from source: magic vars 28983 1726883117.44940: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.44944: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883117.44947: variable 'network_state' from source: role '' defaults 28983 1726883117.44951: Evaluated conditional (network_state != {}): False 28983 1726883117.44954: when evaluation is False, skipping this task 28983 1726883117.44956: _execute() done 28983 1726883117.44963: dumping result to json 28983 1726883117.44973: done dumping result, returning 28983 1726883117.44985: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-000000002332] 28983 1726883117.44997: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002332 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883117.45168: no more pending results, returning what we have 28983 1726883117.45172: results queue empty 28983 1726883117.45173: checking for any_errors_fatal 28983 1726883117.45182: done checking for any_errors_fatal 28983 1726883117.45183: checking for max_fail_percentage 28983 
1726883117.45185: done checking for max_fail_percentage 28983 1726883117.45186: checking to see if all hosts have failed and the running result is not ok 28983 1726883117.45187: done checking to see if all hosts have failed 28983 1726883117.45188: getting the remaining hosts for this loop 28983 1726883117.45191: done getting the remaining hosts for this loop 28983 1726883117.45197: getting the next task for host managed_node2 28983 1726883117.45208: done getting next task for host managed_node2 28983 1726883117.45213: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883117.45222: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883117.45258: getting variables 28983 1726883117.45261: in VariableManager get_vars() 28983 1726883117.45318: Calling all_inventory to load vars for managed_node2 28983 1726883117.45322: Calling groups_inventory to load vars for managed_node2 28983 1726883117.45325: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883117.45532: Calling all_plugins_play to load vars for managed_node2 28983 1726883117.45545: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883117.45551: Calling groups_plugins_play to load vars for managed_node2 28983 1726883117.46076: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002332 28983 1726883117.46081: WORKER PROCESS EXITING 28983 1726883117.46837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883117.48449: done with get_vars() 28983 1726883117.48472: done getting variables 28983 1726883117.48519: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:45:17 -0400 (0:00:00.049) 0:02:27.483 ****** 28983 1726883117.48551: entering _queue_task() for managed_node2/service 28983 1726883117.48764: worker is 1 (out of 1 available) 28983 1726883117.48778: exiting _queue_task() for managed_node2/service 28983 1726883117.48791: done queuing things up, now waiting for results queue to drain 28983 1726883117.48793: waiting for pending results... 
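The next task traced below ("Restart NetworkManager due to wireless or team interfaces", main.yml:109) loads the `service` action plugin and is skipped because `__network_wireless_connections_defined or __network_team_connections_defined` evaluates to False for this connection set. A hypothetical sketch of such a guarded service-restart task (name, module, and conditional come from the log; the service name and state are assumptions, and the actual role task may differ):

```yaml
# Hedged sketch only -- not the actual role source.
# The log shows the 'service' action plugin loaded for this task and the
# compound wireless/team conditional evaluating to False, so the task is
# skipped with "Conditional result was False".
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed target service
    state: restarted       # assumed; a restart matches the task name
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Both `__network_*_connections_defined` flags are role defaults derived from `network_connections`; since the `interface` connections in play define neither wireless nor team interfaces, no restart is attempted.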
28983 1726883117.48998: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883117.49121: in run() - task 0affe814-3a2d-b16d-c0a7-000000002333 28983 1726883117.49136: variable 'ansible_search_path' from source: unknown 28983 1726883117.49140: variable 'ansible_search_path' from source: unknown 28983 1726883117.49174: calling self._execute() 28983 1726883117.49267: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.49275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.49287: variable 'omit' from source: magic vars 28983 1726883117.49614: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.49624: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883117.49739: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883117.49919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883117.51709: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883117.52081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883117.52115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883117.52148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883117.52171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883117.52242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883117.52266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.52292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.52327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.52343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.52386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.52408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.52430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.52463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.52478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.52515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.52539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.52561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.52594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.52606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.52746: variable 'network_connections' from source: include params 28983 1726883117.52757: variable 'interface' from source: play vars 28983 1726883117.52812: variable 'interface' from source: play vars 28983 1726883117.52878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883117.53009: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883117.53042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883117.53072: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883117.53100: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883117.53147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883117.53166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883117.53194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.53216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883117.53258: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883117.53461: variable 'network_connections' from source: include params 28983 1726883117.53465: variable 'interface' from source: play vars 28983 1726883117.53521: variable 'interface' from source: play vars 28983 1726883117.53544: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883117.53548: when evaluation is False, skipping this task 28983 1726883117.53551: _execute() done 28983 1726883117.53554: dumping result to json 28983 1726883117.53558: done dumping result, returning 28983 1726883117.53565: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000002333] 28983 1726883117.53572: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002333 skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883117.53730: no more pending results, returning what we have 28983 1726883117.53736: results queue empty 28983 1726883117.53737: checking for any_errors_fatal 28983 1726883117.53747: done checking for any_errors_fatal 28983 1726883117.53748: checking for max_fail_percentage 28983 1726883117.53750: done checking for max_fail_percentage 28983 1726883117.53751: checking to see if all hosts have failed and the running result is not ok 28983 1726883117.53752: done checking to see if all hosts have failed 28983 1726883117.53753: getting the remaining hosts for this loop 28983 1726883117.53755: done getting the remaining hosts for this loop 28983 1726883117.53760: getting the next task for host managed_node2 28983 1726883117.53769: done getting next task for host managed_node2 28983 1726883117.53773: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883117.53779: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883117.53804: getting variables 28983 1726883117.53806: in VariableManager get_vars() 28983 1726883117.53859: Calling all_inventory to load vars for managed_node2 28983 1726883117.53862: Calling groups_inventory to load vars for managed_node2 28983 1726883117.53865: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883117.53872: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002333 28983 1726883117.53875: WORKER PROCESS EXITING 28983 1726883117.53882: Calling all_plugins_play to load vars for managed_node2 28983 1726883117.53886: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883117.53889: Calling groups_plugins_play to load vars for managed_node2 28983 1726883117.55289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883117.56866: done with get_vars() 28983 1726883117.56892: done getting variables 28983 1726883117.56939: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:45:17 -0400 (0:00:00.084) 0:02:27.567 ****** 28983 1726883117.56966: entering _queue_task() for managed_node2/service 28983 1726883117.57197: worker is 1 (out of 1 available) 28983 1726883117.57212: exiting _queue_task() for managed_node2/service 28983 1726883117.57226: done queuing 
things up, now waiting for results queue to drain 28983 1726883117.57228: waiting for pending results... 28983 1726883117.57431: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883117.57563: in run() - task 0affe814-3a2d-b16d-c0a7-000000002334 28983 1726883117.57580: variable 'ansible_search_path' from source: unknown 28983 1726883117.57585: variable 'ansible_search_path' from source: unknown 28983 1726883117.57616: calling self._execute() 28983 1726883117.57710: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.57715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.57725: variable 'omit' from source: magic vars 28983 1726883117.58440: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.58443: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883117.58548: variable 'network_provider' from source: set_fact 28983 1726883117.58563: variable 'network_state' from source: role '' defaults 28983 1726883117.58586: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883117.58600: variable 'omit' from source: magic vars 28983 1726883117.58698: variable 'omit' from source: magic vars 28983 1726883117.58741: variable 'network_service_name' from source: role '' defaults 28983 1726883117.58828: variable 'network_service_name' from source: role '' defaults 28983 1726883117.58971: variable '__network_provider_setup' from source: role '' defaults 28983 1726883117.58985: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883117.59074: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883117.59090: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883117.59174: variable '__network_packages_default_nm' from source: role '' defaults 
28983 1726883117.59481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883117.62194: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883117.62305: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883117.62366: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883117.62418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883117.62487: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883117.62566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.62591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.62612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.62651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.62666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.62705: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.62724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.62746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.62783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.62796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.62983: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883117.63077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.63102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.63121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.63153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.63166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.63243: variable 'ansible_python' from source: facts 28983 1726883117.63259: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883117.63327: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883117.63391: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883117.63498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.63521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.63545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.63578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.63590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.63630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883117.63663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883117.63685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.63715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883117.63727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883117.63848: variable 'network_connections' from source: include params 28983 1726883117.63852: variable 'interface' from source: play vars 28983 1726883117.63917: variable 'interface' from source: play vars 28983 1726883117.64005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883117.64156: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883117.64201: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883117.64237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883117.64273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883117.64324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883117.64351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883117.64378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883117.64407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883117.64449: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883117.64678: variable 'network_connections' from source: include params 28983 1726883117.64684: variable 'interface' from source: play vars 28983 1726883117.64748: variable 'interface' from source: play vars 28983 1726883117.64776: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883117.64843: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883117.65080: variable 'network_connections' from source: include params 28983 1726883117.65083: variable 'interface' from source: play vars 28983 1726883117.65141: variable 'interface' from source: play vars 28983 1726883117.65161: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883117.65232: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883117.65476: variable 'network_connections' from source: include params 28983 1726883117.65479: variable 'interface' from source: play vars 28983 1726883117.65540: variable 'interface' from source: play vars 28983 1726883117.65584: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883117.65638: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883117.65645: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883117.65696: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883117.65881: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883117.66295: variable 'network_connections' from source: include params 28983 1726883117.66298: variable 'interface' from source: play vars 28983 1726883117.66350: variable 'interface' from source: play vars 28983 1726883117.66357: variable 'ansible_distribution' from source: facts 28983 1726883117.66361: variable '__network_rh_distros' from source: role '' defaults 28983 1726883117.66374: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.66384: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883117.66526: variable 'ansible_distribution' from source: facts 28983 1726883117.66529: variable '__network_rh_distros' from source: role '' defaults 28983 1726883117.66537: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.66544: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883117.66685: variable 'ansible_distribution' from source: facts 28983 1726883117.66689: variable '__network_rh_distros' from source: role '' defaults 28983 1726883117.66695: variable 'ansible_distribution_major_version' from source: facts 28983 1726883117.66726: variable 'network_provider' from source: set_fact 28983 1726883117.66747: variable 'omit' from source: magic vars 28983 1726883117.66773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883117.66795: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883117.66812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883117.66830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883117.66841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883117.66868: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883117.66874: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.66877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883117.66956: Set connection var ansible_connection to ssh 28983 1726883117.66966: Set connection var ansible_shell_executable to /bin/sh 28983 1726883117.66976: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883117.66984: Set connection var ansible_timeout to 10 28983 1726883117.66991: Set connection var ansible_pipelining to False 28983 1726883117.66993: Set connection var ansible_shell_type to sh 28983 1726883117.67013: variable 'ansible_shell_executable' from source: unknown 28983 1726883117.67016: variable 'ansible_connection' from source: unknown 28983 1726883117.67022: variable 'ansible_module_compression' from source: unknown 28983 1726883117.67025: variable 'ansible_shell_type' from source: unknown 28983 1726883117.67027: variable 'ansible_shell_executable' from source: unknown 28983 1726883117.67038: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883117.67041: variable 'ansible_pipelining' from source: unknown 28983 1726883117.67045: variable 'ansible_timeout' from source: unknown 28983 1726883117.67048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883117.67125: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883117.67142: variable 'omit' from source: magic vars 28983 1726883117.67150: starting attempt loop 28983 1726883117.67153: running the handler 28983 1726883117.67213: variable 'ansible_facts' from source: unknown 28983 1726883117.68051: _low_level_execute_command(): starting 28983 1726883117.68057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883117.68580: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883117.68584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.68587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883117.68589: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.68648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883117.68651: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726883117.68657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883117.68732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883117.70505: stdout chunk (state=3): >>>/root <<< 28983 1726883117.70617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883117.70673: stderr chunk (state=3): >>><<< 28983 1726883117.70679: stdout chunk (state=3): >>><<< 28983 1726883117.70700: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883117.70712: _low_level_execute_command(): starting 28983 1726883117.70718: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974 `" && echo ansible-tmp-1726883117.707013-34298-174428544428974="` echo /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974 `" ) && sleep 0' 28983 1726883117.71186: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883117.71189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.71192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883117.71195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883117.71197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.71247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883117.71250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883117.71330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883117.73346: stdout chunk (state=3): >>>ansible-tmp-1726883117.707013-34298-174428544428974=/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974 <<< 28983 1726883117.73460: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 28983 1726883117.73505: stderr chunk (state=3): >>><<< 28983 1726883117.73509: stdout chunk (state=3): >>><<< 28983 1726883117.73522: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883117.707013-34298-174428544428974=/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883117.73552: variable 'ansible_module_compression' from source: unknown 28983 1726883117.73598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883117.73648: variable 'ansible_facts' from source: unknown 28983 1726883117.73793: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py 28983 1726883117.73904: Sending initial 
data 28983 1726883117.73908: Sent initial data (155 bytes) 28983 1726883117.74362: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883117.74366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.74371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883117.74374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.74427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883117.74437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883117.74504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883117.76156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 28983 1726883117.76159: stderr chunk 
(state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883117.76224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883117.76292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpt7ntx_vn /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py <<< 28983 1726883117.76301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py" <<< 28983 1726883117.76361: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpt7ntx_vn" to remote "/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py" <<< 28983 1726883117.78215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883117.78301: stderr chunk (state=3): >>><<< 28983 1726883117.78305: stdout chunk (state=3): >>><<< 28983 1726883117.78312: done transferring module to remote 28983 1726883117.78322: _low_level_execute_command(): starting 28983 1726883117.78327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/ /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py && sleep 0' 28983 1726883117.78743: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883117.78747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.78769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883117.78775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.78815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883117.78818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883117.78892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883117.81039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883117.81043: stdout chunk (state=3): >>><<< 28983 1726883117.81046: stderr chunk (state=3): >>><<< 28983 1726883117.81048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883117.81051: _low_level_execute_command(): starting 28983 1726883117.81053: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/AnsiballZ_systemd.py && sleep 0' 28983 1726883117.81478: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883117.81493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883117.81504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883117.81557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883117.81574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883117.81652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883118.14418: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start<<< 28983 1726883118.14466: stdout chunk (state=3): >>>_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4538368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1732566000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", 
"MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", 
"SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "<<< 28983 1726883118.14488: stdout chunk (state=3): >>>CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", 
"KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", 
"IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883118.16415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883118.16483: stderr chunk (state=3): >>><<< 28983 1726883118.16486: stdout chunk (state=3): >>><<< 28983 1726883118.16504: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4538368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1732566000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883118.16685: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883118.16704: _low_level_execute_command(): starting 28983 1726883118.16708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883117.707013-34298-174428544428974/ > /dev/null 2>&1 && sleep 0' 28983 1726883118.17187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883118.17192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883118.17195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883118.17197: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883118.17200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883118.17250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883118.17257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883118.17329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883118.19275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883118.19319: stderr chunk (state=3): >>><<< 28983 1726883118.19322: stdout chunk (state=3): >>><<< 28983 1726883118.19338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 28983 1726883118.19346: handler run complete 28983 1726883118.19401: attempt loop complete, returning result 28983 1726883118.19405: _execute() done 28983 1726883118.19407: dumping result to json 28983 1726883118.19422: done dumping result, returning 28983 1726883118.19433: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-000000002334] 28983 1726883118.19439: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002334 28983 1726883118.19726: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002334 28983 1726883118.19729: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883118.19797: no more pending results, returning what we have 28983 1726883118.19800: results queue empty 28983 1726883118.19801: checking for any_errors_fatal 28983 1726883118.19808: done checking for any_errors_fatal 28983 1726883118.19809: checking for max_fail_percentage 28983 1726883118.19811: done checking for max_fail_percentage 28983 1726883118.19812: checking to see if all hosts have failed and the running result is not ok 28983 1726883118.19813: done checking to see if all hosts have failed 28983 1726883118.19814: getting the remaining hosts for this loop 28983 1726883118.19816: done getting the remaining hosts for this loop 28983 1726883118.19821: getting the next task for host managed_node2 28983 1726883118.19828: done getting next task for host managed_node2 28983 1726883118.19832: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883118.19840: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883118.19853: getting variables 28983 1726883118.19855: in VariableManager get_vars() 28983 1726883118.19901: Calling all_inventory to load vars for managed_node2 28983 1726883118.19904: Calling groups_inventory to load vars for managed_node2 28983 1726883118.19906: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883118.19916: Calling all_plugins_play to load vars for managed_node2 28983 1726883118.19919: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883118.19922: Calling groups_plugins_play to load vars for managed_node2 28983 1726883118.21364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883118.22965: done with get_vars() 28983 1726883118.22992: done getting variables 28983 1726883118.23043: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:45:18 -0400 (0:00:00.661) 0:02:28.228 ****** 28983 1726883118.23077: entering _queue_task() for managed_node2/service 28983 1726883118.23338: worker is 1 (out of 1 available) 28983 1726883118.23354: exiting _queue_task() for managed_node2/service 28983 1726883118.23368: done queuing things up, now waiting for results queue to drain 28983 1726883118.23370: waiting for pending results... 28983 1726883118.23586: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883118.23722: in run() - task 0affe814-3a2d-b16d-c0a7-000000002335 28983 1726883118.23737: variable 'ansible_search_path' from source: unknown 28983 1726883118.23741: variable 'ansible_search_path' from source: unknown 28983 1726883118.23776: calling self._execute() 28983 1726883118.23869: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883118.23878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883118.23890: variable 'omit' from source: magic vars 28983 1726883118.24232: variable 'ansible_distribution_major_version' from source: facts 28983 1726883118.24246: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883118.24349: variable 'network_provider' from source: set_fact 28983 1726883118.24357: Evaluated conditional (network_provider == "nm"): True 28983 1726883118.24442: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883118.24519: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 28983 1726883118.24679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883118.26417: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883118.26472: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883118.26507: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883118.26541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883118.26564: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883118.26646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883118.26671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883118.26694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883118.26726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883118.26744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883118.26788: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883118.26808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883118.26828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883118.26868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883118.26883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883118.26918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883118.26939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883118.26964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883118.26997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883118.27009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883118.27143: variable 'network_connections' from source: include params 28983 1726883118.27152: variable 'interface' from source: play vars 28983 1726883118.27210: variable 'interface' from source: play vars 28983 1726883118.27275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883118.27408: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883118.27442: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883118.27469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883118.27496: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883118.27535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883118.27555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883118.27619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883118.27622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883118.27638: variable 
'__network_wireless_connections_defined' from source: role '' defaults 28983 1726883118.27845: variable 'network_connections' from source: include params 28983 1726883118.27850: variable 'interface' from source: play vars 28983 1726883118.27902: variable 'interface' from source: play vars 28983 1726883118.27926: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883118.27930: when evaluation is False, skipping this task 28983 1726883118.27933: _execute() done 28983 1726883118.27938: dumping result to json 28983 1726883118.27949: done dumping result, returning 28983 1726883118.27952: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-000000002335] 28983 1726883118.27963: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002335 28983 1726883118.28059: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002335 28983 1726883118.28062: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883118.28125: no more pending results, returning what we have 28983 1726883118.28129: results queue empty 28983 1726883118.28130: checking for any_errors_fatal 28983 1726883118.28164: done checking for any_errors_fatal 28983 1726883118.28165: checking for max_fail_percentage 28983 1726883118.28167: done checking for max_fail_percentage 28983 1726883118.28168: checking to see if all hosts have failed and the running result is not ok 28983 1726883118.28169: done checking to see if all hosts have failed 28983 1726883118.28174: getting the remaining hosts for this loop 28983 1726883118.28176: done getting the remaining hosts for this loop 28983 1726883118.28181: getting the next task for host managed_node2 28983 1726883118.28190: done getting next task for host managed_node2 28983 1726883118.28195: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 28983 1726883118.28201: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883118.28224: getting variables 28983 1726883118.28226: in VariableManager get_vars() 28983 1726883118.28283: Calling all_inventory to load vars for managed_node2 28983 1726883118.28286: Calling groups_inventory to load vars for managed_node2 28983 1726883118.28289: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883118.28298: Calling all_plugins_play to load vars for managed_node2 28983 1726883118.28301: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883118.28304: Calling groups_plugins_play to load vars for managed_node2 28983 1726883118.29613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883118.31246: done with get_vars() 28983 1726883118.31273: done getting variables 28983 1726883118.31325: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:45:18 -0400 (0:00:00.082) 0:02:28.311 ****** 28983 1726883118.31357: entering _queue_task() for managed_node2/service 28983 1726883118.31643: worker is 1 (out of 1 available) 28983 1726883118.31658: exiting _queue_task() for managed_node2/service 28983 1726883118.31675: done queuing things up, now waiting for results queue to drain 28983 1726883118.31677: waiting for pending results... 
28983 1726883118.31882: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883118.31998: in run() - task 0affe814-3a2d-b16d-c0a7-000000002336 28983 1726883118.32012: variable 'ansible_search_path' from source: unknown 28983 1726883118.32017: variable 'ansible_search_path' from source: unknown 28983 1726883118.32052: calling self._execute() 28983 1726883118.32145: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883118.32148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883118.32161: variable 'omit' from source: magic vars 28983 1726883118.32497: variable 'ansible_distribution_major_version' from source: facts 28983 1726883118.32508: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883118.32610: variable 'network_provider' from source: set_fact 28983 1726883118.32617: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883118.32620: when evaluation is False, skipping this task 28983 1726883118.32624: _execute() done 28983 1726883118.32628: dumping result to json 28983 1726883118.32633: done dumping result, returning 28983 1726883118.32643: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-000000002336] 28983 1726883118.32649: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002336 28983 1726883118.32753: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002336 28983 1726883118.32757: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883118.32831: no more pending results, returning what we have 28983 1726883118.32837: results queue empty 28983 1726883118.32838: checking for any_errors_fatal 28983 1726883118.32844: done checking for 
any_errors_fatal 28983 1726883118.32845: checking for max_fail_percentage 28983 1726883118.32847: done checking for max_fail_percentage 28983 1726883118.32848: checking to see if all hosts have failed and the running result is not ok 28983 1726883118.32849: done checking to see if all hosts have failed 28983 1726883118.32850: getting the remaining hosts for this loop 28983 1726883118.32852: done getting the remaining hosts for this loop 28983 1726883118.32857: getting the next task for host managed_node2 28983 1726883118.32865: done getting next task for host managed_node2 28983 1726883118.32873: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883118.32880: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883118.32903: getting variables 28983 1726883118.32905: in VariableManager get_vars() 28983 1726883118.32954: Calling all_inventory to load vars for managed_node2 28983 1726883118.32957: Calling groups_inventory to load vars for managed_node2 28983 1726883118.32960: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883118.32969: Calling all_plugins_play to load vars for managed_node2 28983 1726883118.32974: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883118.32978: Calling groups_plugins_play to load vars for managed_node2 28983 1726883118.34387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883118.35981: done with get_vars() 28983 1726883118.36004: done getting variables 28983 1726883118.36054: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:45:18 -0400 (0:00:00.047) 0:02:28.358 ****** 28983 1726883118.36085: entering _queue_task() for managed_node2/copy 28983 1726883118.36339: worker is 1 (out of 1 available) 28983 1726883118.36354: exiting _queue_task() for managed_node2/copy 28983 1726883118.36369: done queuing things up, now waiting for results queue to drain 28983 1726883118.36374: waiting for pending results... 
28983 1726883118.36579: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883118.36765: in run() - task 0affe814-3a2d-b16d-c0a7-000000002337 28983 1726883118.36769: variable 'ansible_search_path' from source: unknown 28983 1726883118.36775: variable 'ansible_search_path' from source: unknown 28983 1726883118.36778: calling self._execute() 28983 1726883118.36873: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883118.36877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883118.36888: variable 'omit' from source: magic vars 28983 1726883118.37226: variable 'ansible_distribution_major_version' from source: facts 28983 1726883118.37238: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883118.37339: variable 'network_provider' from source: set_fact 28983 1726883118.37347: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883118.37350: when evaluation is False, skipping this task 28983 1726883118.37352: _execute() done 28983 1726883118.37358: dumping result to json 28983 1726883118.37364: done dumping result, returning 28983 1726883118.37371: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-000000002337] 28983 1726883118.37386: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002337 28983 1726883118.37489: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002337 28983 1726883118.37492: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883118.37550: no more pending results, returning what we have 28983 1726883118.37554: results queue empty 28983 1726883118.37555: checking for 
any_errors_fatal 28983 1726883118.37560: done checking for any_errors_fatal 28983 1726883118.37561: checking for max_fail_percentage 28983 1726883118.37564: done checking for max_fail_percentage 28983 1726883118.37565: checking to see if all hosts have failed and the running result is not ok 28983 1726883118.37566: done checking to see if all hosts have failed 28983 1726883118.37567: getting the remaining hosts for this loop 28983 1726883118.37568: done getting the remaining hosts for this loop 28983 1726883118.37573: getting the next task for host managed_node2 28983 1726883118.37581: done getting next task for host managed_node2 28983 1726883118.37586: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883118.37591: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883118.37617: getting variables 28983 1726883118.37618: in VariableManager get_vars() 28983 1726883118.37661: Calling all_inventory to load vars for managed_node2 28983 1726883118.37664: Calling groups_inventory to load vars for managed_node2 28983 1726883118.37666: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883118.37675: Calling all_plugins_play to load vars for managed_node2 28983 1726883118.37678: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883118.37683: Calling groups_plugins_play to load vars for managed_node2 28983 1726883118.39260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883118.41291: done with get_vars() 28983 1726883118.41315: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:45:18 -0400 (0:00:00.052) 0:02:28.411 ****** 28983 1726883118.41385: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883118.41618: worker is 1 (out of 1 available) 28983 1726883118.41632: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883118.41649: done queuing things up, now waiting for results queue to drain 28983 1726883118.41651: waiting for pending results... 
28983 1726883118.41853: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883118.41976: in run() - task 0affe814-3a2d-b16d-c0a7-000000002338 28983 1726883118.41995: variable 'ansible_search_path' from source: unknown 28983 1726883118.41999: variable 'ansible_search_path' from source: unknown 28983 1726883118.42029: calling self._execute() 28983 1726883118.42341: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883118.42346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883118.42348: variable 'omit' from source: magic vars 28983 1726883118.42649: variable 'ansible_distribution_major_version' from source: facts 28983 1726883118.42667: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883118.42689: variable 'omit' from source: magic vars 28983 1726883118.43012: variable 'omit' from source: magic vars 28983 1726883118.43237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883118.46006: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883118.46105: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883118.46156: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883118.46219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883118.46259: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883118.46368: variable 'network_provider' from source: set_fact 28983 1726883118.46547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883118.46586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883118.46631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883118.46689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883118.46719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883118.46816: variable 'omit' from source: magic vars 28983 1726883118.46978: variable 'omit' from source: magic vars 28983 1726883118.47142: variable 'network_connections' from source: include params 28983 1726883118.47145: variable 'interface' from source: play vars 28983 1726883118.47225: variable 'interface' from source: play vars 28983 1726883118.47424: variable 'omit' from source: magic vars 28983 1726883118.47469: variable '__lsr_ansible_managed' from source: task vars 28983 1726883118.47526: variable '__lsr_ansible_managed' from source: task vars 28983 1726883118.47782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883118.48088: Loaded config def from plugin (lookup/template) 28983 1726883118.48122: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883118.48147: File lookup term: get_ansible_managed.j2 28983 1726883118.48157: variable 
'ansible_search_path' from source: unknown 28983 1726883118.48171: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883118.48237: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883118.48242: variable 'ansible_search_path' from source: unknown 28983 1726883118.58366: variable 'ansible_managed' from source: unknown 28983 1726883118.58612: variable 'omit' from source: magic vars 28983 1726883118.58672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883118.58676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883118.58698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883118.58720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883118.58780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883118.58784: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883118.58786: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883118.58789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883118.58890: Set connection var ansible_connection to ssh 28983 1726883118.58899: Set connection var ansible_shell_executable to /bin/sh 28983 1726883118.58910: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883118.58922: Set connection var ansible_timeout to 10 28983 1726883118.58929: Set connection var ansible_pipelining to False 28983 1726883118.58932: Set connection var ansible_shell_type to sh 28983 1726883118.58962: variable 'ansible_shell_executable' from source: unknown 28983 1726883118.58965: variable 'ansible_connection' from source: unknown 28983 1726883118.58970: variable 'ansible_module_compression' from source: unknown 28983 1726883118.58977: variable 'ansible_shell_type' from source: unknown 28983 1726883118.58979: variable 'ansible_shell_executable' from source: unknown 28983 1726883118.58984: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883118.58995: variable 'ansible_pipelining' from source: unknown 28983 1726883118.59000: variable 'ansible_timeout' from source: unknown 28983 1726883118.59003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883118.59155: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883118.59168: variable 'omit' from 
source: magic vars 28983 1726883118.59217: starting attempt loop 28983 1726883118.59221: running the handler 28983 1726883118.59224: _low_level_execute_command(): starting 28983 1726883118.59226: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883118.59930: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883118.59944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883118.59957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883118.59980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883118.60089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883118.60093: stderr chunk (state=3): >>>debug2: match not found <<< 28983 1726883118.60098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883118.60104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883118.60118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883118.60144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883118.60245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883118.62021: stdout 
chunk (state=3): >>>/root <<< 28983 1726883118.62194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883118.62202: stderr chunk (state=3): >>><<< 28983 1726883118.62210: stdout chunk (state=3): >>><<< 28983 1726883118.62440: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883118.62443: _low_level_execute_command(): starting 28983 1726883118.62447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753 `" && echo ansible-tmp-1726883118.6223605-34325-180190021777753="` echo /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753 `" ) && sleep 0' 28983 1726883118.63655: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883118.63799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883118.63956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883118.63991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883118.64093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883118.66166: stdout chunk (state=3): >>>ansible-tmp-1726883118.6223605-34325-180190021777753=/root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753 <<< 28983 1726883118.66552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883118.66614: stderr chunk (state=3): >>><<< 28983 1726883118.66621: stdout chunk (state=3): >>><<< 28983 1726883118.66657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883118.6223605-34325-180190021777753=/root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883118.66710: variable 'ansible_module_compression' from source: unknown 28983 1726883118.66878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883118.66919: variable 'ansible_facts' from source: unknown 28983 1726883118.67286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py 28983 1726883118.67549: Sending initial data 28983 1726883118.67552: Sent initial data (168 bytes) 28983 1726883118.68808: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883118.68816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883118.68921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883118.70591: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28983 1726883118.70607: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883118.70664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883118.70771: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplrzsyab7 /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py <<< 28983 1726883118.70785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py" <<< 28983 1726883118.70839: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmplrzsyab7" to remote "/root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py" <<< 28983 1726883118.72669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883118.72787: stderr chunk (state=3): >>><<< 28983 1726883118.72791: stdout chunk (state=3): >>><<< 28983 1726883118.72795: done transferring module to remote 28983 1726883118.72797: _low_level_execute_command(): starting 28983 1726883118.72800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/ /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py && sleep 0' 28983 1726883118.73454: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883118.73480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883118.73562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883118.73631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883118.73637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883118.73662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883118.73769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883118.75750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883118.75805: stderr chunk (state=3): >>><<< 28983 1726883118.75815: stdout chunk (state=3): >>><<< 28983 1726883118.75839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883118.75857: _low_level_execute_command(): starting 28983 1726883118.75873: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/AnsiballZ_network_connections.py && sleep 0' 28983 1726883118.76654: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 28983 1726883118.76669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883118.76687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883118.76798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 
1726883119.05128: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883119.07101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883119.07118: stdout chunk (state=3): >>><<< 28983 1726883119.07168: stderr chunk (state=3): >>><<< 28983 1726883119.07175: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883119.07200: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883119.07212: _low_level_execute_command(): starting 28983 1726883119.07215: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883118.6223605-34325-180190021777753/ > /dev/null 2>&1 && sleep 0' 28983 1726883119.07676: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883119.07681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883119.07684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.07733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883119.07743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.07810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.09763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883119.09808: stderr chunk (state=3): >>><<< 28983 1726883119.09811: stdout chunk (state=3): >>><<< 28983 1726883119.09829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883119.09832: handler run complete 28983 1726883119.09864: attempt loop complete, returning result 28983 1726883119.09868: _execute() done 28983 1726883119.09870: dumping result to json 28983 1726883119.09879: done dumping result, returning 28983 1726883119.09888: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-000000002338] 28983 1726883119.09893: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002338 28983 1726883119.10007: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002338 28983 1726883119.10010: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up 
persistent_state:present, 'statebr': up connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a skipped because already active 28983 1726883119.10144: no more pending results, returning what we have 28983 1726883119.10148: results queue empty 28983 1726883119.10149: checking for any_errors_fatal 28983 1726883119.10155: done checking for any_errors_fatal 28983 1726883119.10156: checking for max_fail_percentage 28983 1726883119.10159: done checking for max_fail_percentage 28983 1726883119.10160: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.10161: done checking to see if all hosts have failed 28983 1726883119.10162: getting the remaining hosts for this loop 28983 1726883119.10164: done getting the remaining hosts for this loop 28983 1726883119.10169: getting the next task for host managed_node2 28983 1726883119.10182: done getting next task for host managed_node2 28983 1726883119.10187: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883119.10192: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883119.10207: getting variables 28983 1726883119.10208: in VariableManager get_vars() 28983 1726883119.10268: Calling all_inventory to load vars for managed_node2 28983 1726883119.10272: Calling groups_inventory to load vars for managed_node2 28983 1726883119.10275: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.10284: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.10288: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.10292: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.11609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.13212: done with get_vars() 28983 1726883119.13238: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:45:19 -0400 (0:00:00.719) 0:02:29.130 ****** 28983 1726883119.13316: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883119.13573: worker is 1 (out of 1 available) 28983 1726883119.13588: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883119.13601: done queuing things up, now waiting for results queue to drain 28983 1726883119.13603: waiting for pending results... 
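The module run logged above follows Ansible's standard remote execution lifecycle: create a temp dir, `sftp put` the AnsiballZ-wrapped module, `chmod u+x` it, run it with the discovered Python interpreter, then `rm -f -r` the temp dir. A minimal local simulation of that sequence (paths and the demo module are illustrative, not the ones from this run):

```shell
# Simulate the AnsiballZ lifecycle seen in the log, locally instead of over SSH.
tmpdir="$(mktemp -d)"                                # 1. remote tmp dir (ansible-tmp-...)
printf '%s\n' 'print("ok")' > "$tmpdir/AnsiballZ_demo.py"  # 2. "sftp put" of the wrapped module
chmod u+x "$tmpdir/AnsiballZ_demo.py"                # 3. chmod u+x dir and module
out="$(python3 "$tmpdir/AnsiballZ_demo.py")"         # 4. execute with the target interpreter
rm -rf "$tmpdir"                                     # 5. cleanup, as in the final rm -f -r
echo "$out"
```

In the real run each numbered step is one `_low_level_execute_command()` or file transfer over the multiplexed SSH connection, which is why the ssh `debug1`/`debug2` preamble repeats before every step.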
28983 1726883119.13822: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883119.13954: in run() - task 0affe814-3a2d-b16d-c0a7-000000002339 28983 1726883119.13971: variable 'ansible_search_path' from source: unknown 28983 1726883119.13974: variable 'ansible_search_path' from source: unknown 28983 1726883119.14009: calling self._execute() 28983 1726883119.14103: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.14109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.14120: variable 'omit' from source: magic vars 28983 1726883119.14462: variable 'ansible_distribution_major_version' from source: facts 28983 1726883119.14473: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883119.14583: variable 'network_state' from source: role '' defaults 28983 1726883119.14595: Evaluated conditional (network_state != {}): False 28983 1726883119.14598: when evaluation is False, skipping this task 28983 1726883119.14601: _execute() done 28983 1726883119.14606: dumping result to json 28983 1726883119.14617: done dumping result, returning 28983 1726883119.14621: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-000000002339] 28983 1726883119.14626: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002339 28983 1726883119.14724: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002339 28983 1726883119.14727: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883119.14787: no more pending results, returning what we have 28983 1726883119.14791: results queue empty 28983 1726883119.14792: checking for any_errors_fatal 28983 1726883119.14802: done checking for any_errors_fatal 
28983 1726883119.14803: checking for max_fail_percentage 28983 1726883119.14805: done checking for max_fail_percentage 28983 1726883119.14807: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.14808: done checking to see if all hosts have failed 28983 1726883119.14808: getting the remaining hosts for this loop 28983 1726883119.14811: done getting the remaining hosts for this loop 28983 1726883119.14816: getting the next task for host managed_node2 28983 1726883119.14823: done getting next task for host managed_node2 28983 1726883119.14828: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883119.14836: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883119.14860: getting variables 28983 1726883119.14861: in VariableManager get_vars() 28983 1726883119.14903: Calling all_inventory to load vars for managed_node2 28983 1726883119.14906: Calling groups_inventory to load vars for managed_node2 28983 1726883119.14909: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.14918: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.14921: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.14925: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.16344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.17928: done with get_vars() 28983 1726883119.17952: done getting variables 28983 1726883119.18004: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:45:19 -0400 (0:00:00.047) 0:02:29.178 ****** 28983 1726883119.18031: entering _queue_task() for managed_node2/debug 28983 1726883119.18277: worker is 1 (out of 1 available) 28983 1726883119.18296: exiting _queue_task() for managed_node2/debug 28983 1726883119.18309: done queuing things up, now waiting for results queue to drain 28983 1726883119.18311: waiting for pending results... 
28983 1726883119.18517: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883119.18648: in run() - task 0affe814-3a2d-b16d-c0a7-00000000233a 28983 1726883119.18665: variable 'ansible_search_path' from source: unknown 28983 1726883119.18670: variable 'ansible_search_path' from source: unknown 28983 1726883119.18699: calling self._execute() 28983 1726883119.18789: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.18795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.18806: variable 'omit' from source: magic vars 28983 1726883119.19146: variable 'ansible_distribution_major_version' from source: facts 28983 1726883119.19157: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883119.19163: variable 'omit' from source: magic vars 28983 1726883119.19229: variable 'omit' from source: magic vars 28983 1726883119.19260: variable 'omit' from source: magic vars 28983 1726883119.19300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883119.19335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883119.19353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883119.19369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883119.19382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883119.19409: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883119.19415: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.19418: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883119.19501: Set connection var ansible_connection to ssh 28983 1726883119.19511: Set connection var ansible_shell_executable to /bin/sh 28983 1726883119.19520: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883119.19532: Set connection var ansible_timeout to 10 28983 1726883119.19535: Set connection var ansible_pipelining to False 28983 1726883119.19545: Set connection var ansible_shell_type to sh 28983 1726883119.19563: variable 'ansible_shell_executable' from source: unknown 28983 1726883119.19566: variable 'ansible_connection' from source: unknown 28983 1726883119.19569: variable 'ansible_module_compression' from source: unknown 28983 1726883119.19576: variable 'ansible_shell_type' from source: unknown 28983 1726883119.19579: variable 'ansible_shell_executable' from source: unknown 28983 1726883119.19582: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.19588: variable 'ansible_pipelining' from source: unknown 28983 1726883119.19590: variable 'ansible_timeout' from source: unknown 28983 1726883119.19596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.19717: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883119.19728: variable 'omit' from source: magic vars 28983 1726883119.19734: starting attempt loop 28983 1726883119.19739: running the handler 28983 1726883119.19847: variable '__network_connections_result' from source: set_fact 28983 1726883119.19897: handler run complete 28983 1726883119.19912: attempt loop complete, returning result 28983 1726883119.19915: _execute() done 28983 1726883119.19918: dumping result to json 28983 1726883119.19924: 
done dumping result, returning 28983 1726883119.19932: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000233a] 28983 1726883119.19940: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233a 28983 1726883119.20029: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233a 28983 1726883119.20033: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a skipped because already active" ] } 28983 1726883119.20112: no more pending results, returning what we have 28983 1726883119.20116: results queue empty 28983 1726883119.20117: checking for any_errors_fatal 28983 1726883119.20122: done checking for any_errors_fatal 28983 1726883119.20123: checking for max_fail_percentage 28983 1726883119.20125: done checking for max_fail_percentage 28983 1726883119.20126: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.20127: done checking to see if all hosts have failed 28983 1726883119.20128: getting the remaining hosts for this loop 28983 1726883119.20130: done getting the remaining hosts for this loop 28983 1726883119.20142: getting the next task for host managed_node2 28983 1726883119.20152: done getting next task for host managed_node2 28983 1726883119.20156: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883119.20162: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883119.20175: getting variables 28983 1726883119.20177: in VariableManager get_vars() 28983 1726883119.20216: Calling all_inventory to load vars for managed_node2 28983 1726883119.20220: Calling groups_inventory to load vars for managed_node2 28983 1726883119.20222: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.20231: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.20236: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.20240: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.21591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.23205: done with get_vars() 28983 1726883119.23228: done getting variables 28983 1726883119.23277: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:45:19 -0400 (0:00:00.052) 0:02:29.230 ****** 28983 1726883119.23312: entering _queue_task() for managed_node2/debug 28983 1726883119.23538: worker is 1 (out of 1 available) 28983 1726883119.23552: exiting _queue_task() for managed_node2/debug 28983 1726883119.23566: done queuing things up, now waiting for results queue to drain 28983 1726883119.23568: waiting for pending results... 28983 1726883119.23773: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883119.23884: in run() - task 0affe814-3a2d-b16d-c0a7-00000000233b 28983 1726883119.23901: variable 'ansible_search_path' from source: unknown 28983 1726883119.23905: variable 'ansible_search_path' from source: unknown 28983 1726883119.23939: calling self._execute() 28983 1726883119.24033: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.24045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.24056: variable 'omit' from source: magic vars 28983 1726883119.24401: variable 'ansible_distribution_major_version' from source: facts 28983 1726883119.24412: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883119.24419: variable 'omit' from source: magic vars 28983 1726883119.24484: variable 'omit' from source: magic vars 28983 1726883119.24513: variable 'omit' from source: magic vars 28983 1726883119.24550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883119.24584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883119.24603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883119.24619: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883119.24629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883119.24659: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883119.24662: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.24666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.24748: Set connection var ansible_connection to ssh 28983 1726883119.24759: Set connection var ansible_shell_executable to /bin/sh 28983 1726883119.24767: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883119.24777: Set connection var ansible_timeout to 10 28983 1726883119.24789: Set connection var ansible_pipelining to False 28983 1726883119.24794: Set connection var ansible_shell_type to sh 28983 1726883119.24809: variable 'ansible_shell_executable' from source: unknown 28983 1726883119.24812: variable 'ansible_connection' from source: unknown 28983 1726883119.24815: variable 'ansible_module_compression' from source: unknown 28983 1726883119.24820: variable 'ansible_shell_type' from source: unknown 28983 1726883119.24823: variable 'ansible_shell_executable' from source: unknown 28983 1726883119.24827: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.24832: variable 'ansible_pipelining' from source: unknown 28983 1726883119.24837: variable 'ansible_timeout' from source: unknown 28983 1726883119.24842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.24963: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883119.24975: variable 'omit' from source: magic vars 28983 1726883119.24979: starting attempt loop 28983 1726883119.24981: running the handler 28983 1726883119.25029: variable '__network_connections_result' from source: set_fact 28983 1726883119.25092: variable '__network_connections_result' from source: set_fact 28983 1726883119.25183: handler run complete 28983 1726883119.25204: attempt loop complete, returning result 28983 1726883119.25207: _execute() done 28983 1726883119.25212: dumping result to json 28983 1726883119.25219: done dumping result, returning 28983 1726883119.25226: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-00000000233b] 28983 1726883119.25238: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233b 28983 1726883119.25337: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233b 28983 1726883119.25340: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, e447ed98-bcde-4ff6-b521-e956422e5e9a skipped because already active" ] } } 28983 1726883119.25442: no more pending results, returning what we have 28983 1726883119.25446: results queue empty 28983 
1726883119.25448: checking for any_errors_fatal 28983 1726883119.25455: done checking for any_errors_fatal 28983 1726883119.25456: checking for max_fail_percentage 28983 1726883119.25458: done checking for max_fail_percentage 28983 1726883119.25459: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.25460: done checking to see if all hosts have failed 28983 1726883119.25461: getting the remaining hosts for this loop 28983 1726883119.25463: done getting the remaining hosts for this loop 28983 1726883119.25467: getting the next task for host managed_node2 28983 1726883119.25477: done getting next task for host managed_node2 28983 1726883119.25481: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883119.25486: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883119.25499: getting variables 28983 1726883119.25501: in VariableManager get_vars() 28983 1726883119.25548: Calling all_inventory to load vars for managed_node2 28983 1726883119.25551: Calling groups_inventory to load vars for managed_node2 28983 1726883119.25558: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.25567: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.25570: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.25573: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.26816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.28425: done with get_vars() 28983 1726883119.28450: done getting variables 28983 1726883119.28500: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:45:19 -0400 (0:00:00.052) 0:02:29.283 ****** 28983 1726883119.28528: entering _queue_task() for managed_node2/debug 28983 1726883119.28757: worker is 1 (out of 1 available) 28983 1726883119.28771: exiting _queue_task() for managed_node2/debug 28983 1726883119.28785: done queuing things up, now waiting for results queue to drain 28983 1726883119.28787: waiting for pending results... 
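For orientation, the task whose result is printed above ("Show debug messages for the network_connections", task path `roles/network/tasks/main.yml:181`) presumably corresponds to a role task of roughly this shape. The variable name `__network_connections_result` is taken directly from the log; the rest is a hedged reconstruction, not the role's verbatim source:

```yaml
# Hypothetical reconstruction from the log above (not the actual role source).
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```

The printed value is the registered result of the earlier `network_connections` module run, which is why the output includes the module's `_invocation`, `stderr`, and `stderr_lines` fields.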
28983 1726883119.28978: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883119.29112: in run() - task 0affe814-3a2d-b16d-c0a7-00000000233c 28983 1726883119.29130: variable 'ansible_search_path' from source: unknown 28983 1726883119.29133: variable 'ansible_search_path' from source: unknown 28983 1726883119.29165: calling self._execute() 28983 1726883119.29254: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.29258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.29270: variable 'omit' from source: magic vars 28983 1726883119.29593: variable 'ansible_distribution_major_version' from source: facts 28983 1726883119.29604: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883119.29712: variable 'network_state' from source: role '' defaults 28983 1726883119.29722: Evaluated conditional (network_state != {}): False 28983 1726883119.29725: when evaluation is False, skipping this task 28983 1726883119.29728: _execute() done 28983 1726883119.29735: dumping result to json 28983 1726883119.29738: done dumping result, returning 28983 1726883119.29747: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-00000000233c] 28983 1726883119.29753: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233c 28983 1726883119.29855: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233c 28983 1726883119.29859: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883119.29930: no more pending results, returning what we have 28983 1726883119.29936: results queue empty 28983 1726883119.29937: checking for any_errors_fatal 28983 1726883119.29944: done checking for any_errors_fatal 28983 1726883119.29945: checking for 
max_fail_percentage 28983 1726883119.29947: done checking for max_fail_percentage 28983 1726883119.29948: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.29949: done checking to see if all hosts have failed 28983 1726883119.29950: getting the remaining hosts for this loop 28983 1726883119.29951: done getting the remaining hosts for this loop 28983 1726883119.29955: getting the next task for host managed_node2 28983 1726883119.29962: done getting next task for host managed_node2 28983 1726883119.29967: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883119.29974: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883119.29995: getting variables 28983 1726883119.29997: in VariableManager get_vars() 28983 1726883119.30038: Calling all_inventory to load vars for managed_node2 28983 1726883119.30041: Calling groups_inventory to load vars for managed_node2 28983 1726883119.30043: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.30050: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.30052: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.30055: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.31416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.33007: done with get_vars() 28983 1726883119.33029: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:45:19 -0400 (0:00:00.045) 0:02:29.328 ****** 28983 1726883119.33108: entering _queue_task() for managed_node2/ping 28983 1726883119.33314: worker is 1 (out of 1 available) 28983 1726883119.33327: exiting _queue_task() for managed_node2/ping 28983 1726883119.33341: done queuing things up, now waiting for results queue to drain 28983 1726883119.33343: waiting for pending results... 
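The skip above and the task queued next suggest role tasks of roughly the following shape. Only the task names, the `when` condition (recorded as `false_condition` in the skip result), and the use of the `ping` module are taken from the log; the debug variable name is hypothetical, since the task was skipped before its arguments were loaded:

```yaml
# Hedged sketch, not the role's verbatim source.
- name: Show debug messages for the network_state
  debug:
    var: __network_state_result   # hypothetical variable name
  when: network_state != {}       # logged as the false_condition for the skip

- name: Re-test connectivity
  ping:
```

Because `network_state` still holds the role default of `{}`, the conditional evaluates to False and the task is skipped without contacting the host; the `ping` task that follows does a full module round-trip, traced below.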
28983 1726883119.33545: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883119.33678: in run() - task 0affe814-3a2d-b16d-c0a7-00000000233d 28983 1726883119.33694: variable 'ansible_search_path' from source: unknown 28983 1726883119.33698: variable 'ansible_search_path' from source: unknown 28983 1726883119.33729: calling self._execute() 28983 1726883119.33824: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.33830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.33843: variable 'omit' from source: magic vars 28983 1726883119.34166: variable 'ansible_distribution_major_version' from source: facts 28983 1726883119.34178: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883119.34185: variable 'omit' from source: magic vars 28983 1726883119.34248: variable 'omit' from source: magic vars 28983 1726883119.34279: variable 'omit' from source: magic vars 28983 1726883119.34316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883119.34350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883119.34369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883119.34386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883119.34396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883119.34423: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883119.34427: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.34431: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883119.34515: Set connection var ansible_connection to ssh 28983 1726883119.34525: Set connection var ansible_shell_executable to /bin/sh 28983 1726883119.34536: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883119.34544: Set connection var ansible_timeout to 10 28983 1726883119.34551: Set connection var ansible_pipelining to False 28983 1726883119.34555: Set connection var ansible_shell_type to sh 28983 1726883119.34579: variable 'ansible_shell_executable' from source: unknown 28983 1726883119.34582: variable 'ansible_connection' from source: unknown 28983 1726883119.34585: variable 'ansible_module_compression' from source: unknown 28983 1726883119.34590: variable 'ansible_shell_type' from source: unknown 28983 1726883119.34593: variable 'ansible_shell_executable' from source: unknown 28983 1726883119.34597: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.34602: variable 'ansible_pipelining' from source: unknown 28983 1726883119.34605: variable 'ansible_timeout' from source: unknown 28983 1726883119.34611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.34782: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883119.34789: variable 'omit' from source: magic vars 28983 1726883119.34797: starting attempt loop 28983 1726883119.34800: running the handler 28983 1726883119.34812: _low_level_execute_command(): starting 28983 1726883119.34820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883119.35340: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 
1726883119.35376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.35380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883119.35382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.35440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883119.35443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883119.35445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.35524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.37324: stdout chunk (state=3): >>>/root <<< 28983 1726883119.37438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883119.37485: stderr chunk (state=3): >>><<< 28983 1726883119.37488: stdout chunk (state=3): >>><<< 28983 1726883119.37508: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883119.37521: _low_level_execute_command(): starting 28983 1726883119.37536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600 `" && echo ansible-tmp-1726883119.3750885-34353-139380565613600="` echo /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600 `" ) && sleep 0' 28983 1726883119.37970: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883119.37975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.37978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 28983 1726883119.37987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.38038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883119.38043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.38119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.40145: stdout chunk (state=3): >>>ansible-tmp-1726883119.3750885-34353-139380565613600=/root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600 <<< 28983 1726883119.40267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883119.40309: stderr chunk (state=3): >>><<< 28983 1726883119.40312: stdout chunk (state=3): >>><<< 28983 1726883119.40327: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883119.3750885-34353-139380565613600=/root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883119.40365: variable 'ansible_module_compression' from source: unknown 28983 1726883119.40401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883119.40429: variable 'ansible_facts' from source: unknown 28983 1726883119.40492: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py 28983 1726883119.40595: Sending initial data 28983 1726883119.40599: Sent initial data (153 bytes) 28983 1726883119.41251: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.41319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883119.41339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883119.41359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.41461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.43097: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883119.43102: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883119.43160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883119.43233: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpvlvao63g /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py <<< 28983 1726883119.43238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py" <<< 28983 1726883119.43298: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpvlvao63g" to remote "/root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py" <<< 28983 1726883119.44642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883119.44646: stderr chunk (state=3): >>><<< 28983 1726883119.44648: stdout chunk (state=3): >>><<< 28983 1726883119.44814: done transferring module to remote 28983 1726883119.44817: _low_level_execute_command(): starting 28983 1726883119.44820: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/ /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py && sleep 0' 28983 1726883119.45453: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883119.45457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.45552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883119.45576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.45650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.47619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883119.47631: stderr chunk (state=3): >>><<< 28983 1726883119.47643: stdout chunk (state=3): >>><<< 28983 1726883119.47663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883119.47675: _low_level_execute_command(): starting 28983 1726883119.47759: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/AnsiballZ_ping.py && sleep 0' 28983 1726883119.48303: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883119.48317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883119.48336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883119.48358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883119.48426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883119.48481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883119.48498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883119.48529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.48639: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.65770: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883119.67263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883119.67341: stderr chunk (state=3): >>><<< 28983 1726883119.67344: stdout chunk (state=3): >>><<< 28983 1726883119.67386: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883119.67518: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883119.67521: _low_level_execute_command(): starting 28983 1726883119.67524: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883119.3750885-34353-139380565613600/ > /dev/null 2>&1 && sleep 0' 28983 1726883119.68100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883119.68118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883119.68137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883119.68161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883119.68202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883119.68224: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883119.68318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883119.68358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883119.68488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883119.70541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883119.70732: stderr chunk (state=3): >>><<< 28983 1726883119.70739: stdout chunk (state=3): >>><<< 28983 1726883119.70742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
28983 1726883119.70749: handler run complete 28983 1726883119.70752: attempt loop complete, returning result 28983 1726883119.70754: _execute() done 28983 1726883119.70756: dumping result to json 28983 1726883119.70758: done dumping result, returning 28983 1726883119.70760: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-00000000233d] 28983 1726883119.70762: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233d 28983 1726883119.71163: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000233d 28983 1726883119.71167: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883119.71259: no more pending results, returning what we have 28983 1726883119.71264: results queue empty 28983 1726883119.71265: checking for any_errors_fatal 28983 1726883119.71275: done checking for any_errors_fatal 28983 1726883119.71313: checking for max_fail_percentage 28983 1726883119.71317: done checking for max_fail_percentage 28983 1726883119.71318: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.71319: done checking to see if all hosts have failed 28983 1726883119.71320: getting the remaining hosts for this loop 28983 1726883119.71322: done getting the remaining hosts for this loop 28983 1726883119.71327: getting the next task for host managed_node2 28983 1726883119.71401: done getting next task for host managed_node2 28983 1726883119.71404: ^ task is: TASK: meta (role_complete) 28983 1726883119.71411: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883119.71427: getting variables 28983 1726883119.71429: in VariableManager get_vars() 28983 1726883119.71551: Calling all_inventory to load vars for managed_node2 28983 1726883119.71555: Calling groups_inventory to load vars for managed_node2 28983 1726883119.71558: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.71573: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.71577: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.71581: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.74378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.76424: done with get_vars() 28983 1726883119.76455: done getting variables 28983 1726883119.76525: done queuing things up, now waiting for results queue to drain 28983 1726883119.76526: results queue empty 28983 1726883119.76527: checking for any_errors_fatal 28983 1726883119.76529: done checking for any_errors_fatal 28983 1726883119.76530: checking for max_fail_percentage 28983 1726883119.76531: done checking for max_fail_percentage 28983 1726883119.76531: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883119.76532: done checking to see if all hosts have failed 28983 1726883119.76532: getting the remaining hosts for this loop 28983 1726883119.76533: done getting the remaining hosts for this loop 28983 1726883119.76538: getting the next task for host managed_node2 28983 1726883119.76543: done getting next task for host managed_node2 28983 1726883119.76546: ^ task is: TASK: Include network role 28983 1726883119.76547: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883119.76550: getting variables 28983 1726883119.76551: in VariableManager get_vars() 28983 1726883119.76562: Calling all_inventory to load vars for managed_node2 28983 1726883119.76564: Calling groups_inventory to load vars for managed_node2 28983 1726883119.76566: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.76570: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.76574: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.76576: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.77993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.80983: done with get_vars() 28983 1726883119.81023: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:45:19 -0400 (0:00:00.480) 0:02:29.809 ****** 28983 1726883119.81125: entering _queue_task() for managed_node2/include_role 28983 1726883119.81526: worker is 1 (out of 1 available) 28983 1726883119.81543: exiting _queue_task() for managed_node2/include_role 28983 1726883119.81557: done queuing things up, now waiting for results queue to drain 28983 1726883119.81559: waiting for pending results... 
28983 1726883119.81862: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883119.81974: in run() - task 0affe814-3a2d-b16d-c0a7-000000002142 28983 1726883119.82040: variable 'ansible_search_path' from source: unknown 28983 1726883119.82045: variable 'ansible_search_path' from source: unknown 28983 1726883119.82058: calling self._execute() 28983 1726883119.82189: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883119.82204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883119.82225: variable 'omit' from source: magic vars 28983 1726883119.82740: variable 'ansible_distribution_major_version' from source: facts 28983 1726883119.82745: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883119.82749: _execute() done 28983 1726883119.82753: dumping result to json 28983 1726883119.82757: done dumping result, returning 28983 1726883119.82771: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000002142] 28983 1726883119.82785: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002142 28983 1726883119.83173: no more pending results, returning what we have 28983 1726883119.83178: in VariableManager get_vars() 28983 1726883119.83229: Calling all_inventory to load vars for managed_node2 28983 1726883119.83233: Calling groups_inventory to load vars for managed_node2 28983 1726883119.83239: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.83260: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.83265: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.83269: Calling groups_plugins_play to load vars for managed_node2 28983 1726883119.83851: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002142 28983 1726883119.83855: WORKER PROCESS EXITING 28983 1726883119.92259: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883119.95209: done with get_vars() 28983 1726883119.95232: variable 'ansible_search_path' from source: unknown 28983 1726883119.95236: variable 'ansible_search_path' from source: unknown 28983 1726883119.95360: variable 'omit' from source: magic vars 28983 1726883119.95396: variable 'omit' from source: magic vars 28983 1726883119.95408: variable 'omit' from source: magic vars 28983 1726883119.95410: we have included files to process 28983 1726883119.95411: generating all_blocks data 28983 1726883119.95413: done generating all_blocks data 28983 1726883119.95416: processing included file: fedora.linux_system_roles.network 28983 1726883119.95432: in VariableManager get_vars() 28983 1726883119.95446: done with get_vars() 28983 1726883119.95466: in VariableManager get_vars() 28983 1726883119.95481: done with get_vars() 28983 1726883119.95509: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883119.95602: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883119.95667: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883119.96040: in VariableManager get_vars() 28983 1726883119.96059: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883119.98127: iterating over new_blocks loaded from include file 28983 1726883119.98129: in VariableManager get_vars() 28983 1726883119.98151: done with get_vars() 28983 1726883119.98153: filtering new block on tags 28983 1726883119.98559: done filtering new block on tags 28983 1726883119.98563: in VariableManager get_vars() 28983 1726883119.98586: done with get_vars() 28983 1726883119.98588: filtering new block on tags 28983 1726883119.98609: done 
filtering new block on tags 28983 1726883119.98611: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883119.98617: extending task lists for all hosts with included blocks 28983 1726883119.98764: done extending task lists 28983 1726883119.98765: done processing included files 28983 1726883119.98766: results queue empty 28983 1726883119.98767: checking for any_errors_fatal 28983 1726883119.98769: done checking for any_errors_fatal 28983 1726883119.98772: checking for max_fail_percentage 28983 1726883119.98774: done checking for max_fail_percentage 28983 1726883119.98775: checking to see if all hosts have failed and the running result is not ok 28983 1726883119.98776: done checking to see if all hosts have failed 28983 1726883119.98777: getting the remaining hosts for this loop 28983 1726883119.98778: done getting the remaining hosts for this loop 28983 1726883119.98781: getting the next task for host managed_node2 28983 1726883119.98786: done getting next task for host managed_node2 28983 1726883119.98789: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883119.98792: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883119.98805: getting variables 28983 1726883119.98807: in VariableManager get_vars() 28983 1726883119.98824: Calling all_inventory to load vars for managed_node2 28983 1726883119.98827: Calling groups_inventory to load vars for managed_node2 28983 1726883119.98829: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883119.98837: Calling all_plugins_play to load vars for managed_node2 28983 1726883119.98841: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883119.98845: Calling groups_plugins_play to load vars for managed_node2 28983 1726883120.00131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883120.02274: done with get_vars() 28983 1726883120.02303: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:45:20 -0400 (0:00:00.212) 0:02:30.021 ****** 28983 1726883120.02373: entering _queue_task() for managed_node2/include_tasks 28983 1726883120.02655: worker is 1 (out of 1 available) 28983 1726883120.02670: exiting _queue_task() for managed_node2/include_tasks 28983 1726883120.02685: done queuing things up, now waiting for results queue to drain 28983 1726883120.02687: waiting for pending results... 
28983 1726883120.02894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883120.03021: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024a4 28983 1726883120.03035: variable 'ansible_search_path' from source: unknown 28983 1726883120.03041: variable 'ansible_search_path' from source: unknown 28983 1726883120.03074: calling self._execute() 28983 1726883120.03163: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.03170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883120.03183: variable 'omit' from source: magic vars 28983 1726883120.03517: variable 'ansible_distribution_major_version' from source: facts 28983 1726883120.03527: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883120.03533: _execute() done 28983 1726883120.03539: dumping result to json 28983 1726883120.03544: done dumping result, returning 28983 1726883120.03551: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-0000000024a4] 28983 1726883120.03557: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a4 28983 1726883120.03648: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a4 28983 1726883120.03651: WORKER PROCESS EXITING 28983 1726883120.03743: no more pending results, returning what we have 28983 1726883120.03748: in VariableManager get_vars() 28983 1726883120.03799: Calling all_inventory to load vars for managed_node2 28983 1726883120.03803: Calling groups_inventory to load vars for managed_node2 28983 1726883120.03806: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883120.03815: Calling all_plugins_play to load vars for managed_node2 28983 1726883120.03818: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883120.03821: Calling 
groups_plugins_play to load vars for managed_node2 28983 1726883120.05059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883120.06775: done with get_vars() 28983 1726883120.06794: variable 'ansible_search_path' from source: unknown 28983 1726883120.06795: variable 'ansible_search_path' from source: unknown 28983 1726883120.06828: we have included files to process 28983 1726883120.06829: generating all_blocks data 28983 1726883120.06830: done generating all_blocks data 28983 1726883120.06833: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883120.06835: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883120.06837: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883120.07304: done processing included file 28983 1726883120.07305: iterating over new_blocks loaded from include file 28983 1726883120.07306: in VariableManager get_vars() 28983 1726883120.07327: done with get_vars() 28983 1726883120.07329: filtering new block on tags 28983 1726883120.07355: done filtering new block on tags 28983 1726883120.07358: in VariableManager get_vars() 28983 1726883120.07381: done with get_vars() 28983 1726883120.07382: filtering new block on tags 28983 1726883120.07419: done filtering new block on tags 28983 1726883120.07421: in VariableManager get_vars() 28983 1726883120.07442: done with get_vars() 28983 1726883120.07443: filtering new block on tags 28983 1726883120.07482: done filtering new block on tags 28983 1726883120.07484: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883120.07488: extending task lists for 
all hosts with included blocks 28983 1726883120.08860: done extending task lists 28983 1726883120.08861: done processing included files 28983 1726883120.08862: results queue empty 28983 1726883120.08863: checking for any_errors_fatal 28983 1726883120.08866: done checking for any_errors_fatal 28983 1726883120.08867: checking for max_fail_percentage 28983 1726883120.08867: done checking for max_fail_percentage 28983 1726883120.08868: checking to see if all hosts have failed and the running result is not ok 28983 1726883120.08869: done checking to see if all hosts have failed 28983 1726883120.08872: getting the remaining hosts for this loop 28983 1726883120.08873: done getting the remaining hosts for this loop 28983 1726883120.08876: getting the next task for host managed_node2 28983 1726883120.08880: done getting next task for host managed_node2 28983 1726883120.08883: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883120.08886: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883120.08895: getting variables 28983 1726883120.08896: in VariableManager get_vars() 28983 1726883120.08907: Calling all_inventory to load vars for managed_node2 28983 1726883120.08909: Calling groups_inventory to load vars for managed_node2 28983 1726883120.08911: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883120.08915: Calling all_plugins_play to load vars for managed_node2 28983 1726883120.08916: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883120.08919: Calling groups_plugins_play to load vars for managed_node2 28983 1726883120.09990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883120.11566: done with get_vars() 28983 1726883120.11592: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:45:20 -0400 (0:00:00.092) 0:02:30.114 ****** 28983 1726883120.11650: entering _queue_task() for managed_node2/setup 28983 1726883120.11898: worker is 1 (out of 1 available) 28983 1726883120.11911: exiting _queue_task() for managed_node2/setup 28983 1726883120.11926: done queuing things up, now waiting for results queue to drain 28983 1726883120.11928: waiting for pending results... 
28983 1726883120.12119: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883120.12247: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024fb 28983 1726883120.12261: variable 'ansible_search_path' from source: unknown 28983 1726883120.12266: variable 'ansible_search_path' from source: unknown 28983 1726883120.12298: calling self._execute() 28983 1726883120.12380: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.12388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883120.12400: variable 'omit' from source: magic vars 28983 1726883120.12723: variable 'ansible_distribution_major_version' from source: facts 28983 1726883120.12736: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883120.12919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883120.14816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883120.14872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883120.14909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883120.14942: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883120.14966: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883120.15037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883120.15061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883120.15085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883120.15121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883120.15136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883120.15186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883120.15208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883120.15232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883120.15269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883120.15285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883120.15425: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883120.15433: variable 'ansible_facts' from source: unknown 28983 1726883120.16122: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883120.16125: when evaluation is False, skipping this task 28983 1726883120.16129: _execute() done 28983 1726883120.16131: dumping result to json 28983 1726883120.16138: done dumping result, returning 28983 1726883120.16146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-0000000024fb] 28983 1726883120.16151: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024fb 28983 1726883120.16242: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024fb 28983 1726883120.16246: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883120.16297: no more pending results, returning what we have 28983 1726883120.16301: results queue empty 28983 1726883120.16302: checking for any_errors_fatal 28983 1726883120.16304: done checking for any_errors_fatal 28983 1726883120.16304: checking for max_fail_percentage 28983 1726883120.16306: done checking for max_fail_percentage 28983 1726883120.16307: checking to see if all hosts have failed and the running result is not ok 28983 1726883120.16308: done checking to see if all hosts have failed 28983 1726883120.16309: getting the remaining hosts for this loop 28983 1726883120.16312: done getting the remaining hosts for this loop 28983 1726883120.16316: getting the next task for host managed_node2 28983 1726883120.16327: done getting next task for host managed_node2 28983 1726883120.16331: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883120.16340: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883120.16373: getting variables 28983 1726883120.16375: in VariableManager get_vars() 28983 1726883120.16418: Calling all_inventory to load vars for managed_node2 28983 1726883120.16421: Calling groups_inventory to load vars for managed_node2 28983 1726883120.16423: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883120.16432: Calling all_plugins_play to load vars for managed_node2 28983 1726883120.16437: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883120.16447: Calling groups_plugins_play to load vars for managed_node2 28983 1726883120.17753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883120.19386: done with get_vars() 28983 1726883120.19411: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:45:20 -0400 (0:00:00.078) 0:02:30.192 ****** 28983 1726883120.19489: entering _queue_task() for managed_node2/stat 28983 1726883120.19714: worker is 1 (out of 1 available) 28983 1726883120.19729: exiting _queue_task() for managed_node2/stat 28983 1726883120.19744: done queuing things up, now waiting for results queue to drain 28983 1726883120.19746: waiting for pending results... 
28983 1726883120.19940: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883120.20069: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024fd 28983 1726883120.20086: variable 'ansible_search_path' from source: unknown 28983 1726883120.20091: variable 'ansible_search_path' from source: unknown 28983 1726883120.20124: calling self._execute() 28983 1726883120.20277: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.20341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883120.20344: variable 'omit' from source: magic vars 28983 1726883120.20806: variable 'ansible_distribution_major_version' from source: facts 28983 1726883120.20825: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883120.21190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883120.21392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883120.21432: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883120.21473: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883120.21503: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883120.21577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883120.21596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883120.21617: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883120.21645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883120.21727: variable '__network_is_ostree' from source: set_fact 28983 1726883120.21733: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883120.21739: when evaluation is False, skipping this task 28983 1726883120.21742: _execute() done 28983 1726883120.21746: dumping result to json 28983 1726883120.21757: done dumping result, returning 28983 1726883120.21760: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-0000000024fd] 28983 1726883120.21763: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024fd 28983 1726883120.21853: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024fd skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883120.21919: no more pending results, returning what we have 28983 1726883120.21923: results queue empty 28983 1726883120.21924: checking for any_errors_fatal 28983 1726883120.21929: done checking for any_errors_fatal 28983 1726883120.21930: checking for max_fail_percentage 28983 1726883120.21932: done checking for max_fail_percentage 28983 1726883120.21933: checking to see if all hosts have failed and the running result is not ok 28983 1726883120.21936: done checking to see if all hosts have failed 28983 1726883120.21937: getting the remaining hosts for this loop 28983 1726883120.21939: done getting the remaining hosts for this loop 28983 1726883120.21944: getting the next task for host 
managed_node2 28983 1726883120.21952: done getting next task for host managed_node2 28983 1726883120.21955: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883120.21962: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883120.21990: getting variables 28983 1726883120.21992: in VariableManager get_vars() 28983 1726883120.22032: Calling all_inventory to load vars for managed_node2 28983 1726883120.22043: Calling groups_inventory to load vars for managed_node2 28983 1726883120.22046: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883120.22052: WORKER PROCESS EXITING 28983 1726883120.22060: Calling all_plugins_play to load vars for managed_node2 28983 1726883120.22063: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883120.22065: Calling groups_plugins_play to load vars for managed_node2 28983 1726883120.23524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883120.26508: done with get_vars() 28983 1726883120.26544: done getting variables 28983 1726883120.26611: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:45:20 -0400 (0:00:00.071) 0:02:30.264 ****** 28983 1726883120.26656: entering _queue_task() for managed_node2/set_fact 28983 1726883120.26955: worker is 1 (out of 1 available) 28983 1726883120.26967: exiting _queue_task() for managed_node2/set_fact 28983 1726883120.26983: done queuing things up, now waiting for results queue to drain 28983 1726883120.26985: waiting for pending results... 
28983 1726883120.27305: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883120.27490: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024fe 28983 1726883120.27511: variable 'ansible_search_path' from source: unknown 28983 1726883120.27519: variable 'ansible_search_path' from source: unknown 28983 1726883120.27572: calling self._execute() 28983 1726883120.27696: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.27710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883120.27728: variable 'omit' from source: magic vars 28983 1726883120.28208: variable 'ansible_distribution_major_version' from source: facts 28983 1726883120.28231: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883120.28449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883120.28785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883120.28845: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883120.28898: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883120.28945: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883120.29048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883120.29090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883120.29127: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883120.29165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883120.29287: variable '__network_is_ostree' from source: set_fact 28983 1726883120.29410: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883120.29413: when evaluation is False, skipping this task 28983 1726883120.29416: _execute() done 28983 1726883120.29418: dumping result to json 28983 1726883120.29420: done dumping result, returning 28983 1726883120.29424: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-0000000024fe] 28983 1726883120.29426: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024fe 28983 1726883120.29497: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024fe 28983 1726883120.29500: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883120.29555: no more pending results, returning what we have 28983 1726883120.29560: results queue empty 28983 1726883120.29561: checking for any_errors_fatal 28983 1726883120.29568: done checking for any_errors_fatal 28983 1726883120.29569: checking for max_fail_percentage 28983 1726883120.29574: done checking for max_fail_percentage 28983 1726883120.29575: checking to see if all hosts have failed and the running result is not ok 28983 1726883120.29576: done checking to see if all hosts have failed 28983 1726883120.29577: getting the remaining hosts for this loop 28983 1726883120.29580: done getting the remaining hosts for this loop 
28983 1726883120.29585: getting the next task for host managed_node2 28983 1726883120.29600: done getting next task for host managed_node2 28983 1726883120.29604: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883120.29611: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883120.29651: getting variables 28983 1726883120.29654: in VariableManager get_vars() 28983 1726883120.29711: Calling all_inventory to load vars for managed_node2 28983 1726883120.29714: Calling groups_inventory to load vars for managed_node2 28983 1726883120.29717: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883120.29727: Calling all_plugins_play to load vars for managed_node2 28983 1726883120.29732: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883120.29941: Calling groups_plugins_play to load vars for managed_node2 28983 1726883120.32197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883120.35265: done with get_vars() 28983 1726883120.35305: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:45:20 -0400 (0:00:00.087) 0:02:30.351 ****** 28983 1726883120.35417: entering _queue_task() for managed_node2/service_facts 28983 1726883120.35973: worker is 1 (out of 1 available) 28983 1726883120.35985: exiting _queue_task() for managed_node2/service_facts 28983 1726883120.35997: done queuing things up, now waiting for results queue to drain 28983 1726883120.35999: waiting for pending results... 
28983 1726883120.36156: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883120.36382: in run() - task 0affe814-3a2d-b16d-c0a7-000000002500 28983 1726883120.36445: variable 'ansible_search_path' from source: unknown 28983 1726883120.36449: variable 'ansible_search_path' from source: unknown 28983 1726883120.36453: calling self._execute() 28983 1726883120.36582: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.36598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883120.36617: variable 'omit' from source: magic vars 28983 1726883120.37208: variable 'ansible_distribution_major_version' from source: facts 28983 1726883120.37233: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883120.37247: variable 'omit' from source: magic vars 28983 1726883120.37392: variable 'omit' from source: magic vars 28983 1726883120.37481: variable 'omit' from source: magic vars 28983 1726883120.37544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883120.37594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883120.37641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883120.37661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883120.37741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883120.37744: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883120.37749: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.37752: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883120.37905: Set connection var ansible_connection to ssh 28983 1726883120.37940: Set connection var ansible_shell_executable to /bin/sh 28983 1726883120.37944: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883120.37960: Set connection var ansible_timeout to 10 28983 1726883120.37998: Set connection var ansible_pipelining to False 28983 1726883120.38001: Set connection var ansible_shell_type to sh 28983 1726883120.38020: variable 'ansible_shell_executable' from source: unknown 28983 1726883120.38030: variable 'ansible_connection' from source: unknown 28983 1726883120.38042: variable 'ansible_module_compression' from source: unknown 28983 1726883120.38105: variable 'ansible_shell_type' from source: unknown 28983 1726883120.38108: variable 'ansible_shell_executable' from source: unknown 28983 1726883120.38111: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883120.38113: variable 'ansible_pipelining' from source: unknown 28983 1726883120.38116: variable 'ansible_timeout' from source: unknown 28983 1726883120.38118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883120.38340: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883120.38359: variable 'omit' from source: magic vars 28983 1726883120.38512: starting attempt loop 28983 1726883120.38516: running the handler 28983 1726883120.38519: _low_level_execute_command(): starting 28983 1726883120.38523: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883120.39105: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883120.39116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883120.39131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883120.39149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883120.39202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883120.39211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883120.39299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883120.41088: stdout chunk (state=3): >>>/root <<< 28983 1726883120.41275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883120.41280: stdout chunk (state=3): >>><<< 28983 1726883120.41283: stderr chunk (state=3): >>><<< 28983 1726883120.41314: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883120.41353: _low_level_execute_command(): starting 28983 1726883120.41357: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069 `" && echo ansible-tmp-1726883120.4133167-34387-37003227420069="` echo /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069 `" ) && sleep 0' 28983 1726883120.41824: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883120.41828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883120.41831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28983 1726883120.41842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883120.41893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883120.41896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883120.41966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883120.44005: stdout chunk (state=3): >>>ansible-tmp-1726883120.4133167-34387-37003227420069=/root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069 <<< 28983 1726883120.44244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883120.44247: stdout chunk (state=3): >>><<< 28983 1726883120.44250: stderr chunk (state=3): >>><<< 28983 1726883120.44253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883120.4133167-34387-37003227420069=/root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883120.44265: variable 'ansible_module_compression' from source: unknown 28983 1726883120.44316: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883120.44357: variable 'ansible_facts' from source: unknown 28983 1726883120.44458: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py 28983 1726883120.44750: Sending initial data 28983 1726883120.44753: Sent initial data (161 bytes) 28983 1726883120.45131: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883120.45136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883120.45139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883120.45142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883120.45198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883120.45204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883120.45272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883120.46940: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883120.47025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883120.47116: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpbokbcy5y /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py <<< 28983 1726883120.47121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py" <<< 28983 1726883120.47188: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpbokbcy5y" to remote "/root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py" <<< 28983 1726883120.48541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883120.48578: stderr chunk (state=3): >>><<< 28983 1726883120.48581: stdout chunk (state=3): >>><<< 28983 1726883120.48605: done transferring module to remote 28983 1726883120.48615: _low_level_execute_command(): starting 28983 1726883120.48621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/ /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py && sleep 0' 28983 1726883120.49228: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883120.49347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883120.49379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883120.49489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883120.51368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883120.51414: stderr chunk (state=3): >>><<< 28983 1726883120.51418: stdout chunk (state=3): >>><<< 28983 1726883120.51436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883120.51439: _low_level_execute_command(): starting 28983 1726883120.51445: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/AnsiballZ_service_facts.py && sleep 0' 28983 1726883120.52131: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883120.52153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883120.52324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883122.51230: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service<<< 28983 1726883122.51254: stdout chunk (state=3): >>>", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", 
"source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "sourc<<< 28983 1726883122.51302: stdout chunk (state=3): >>>e": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "stat<<< 28983 1726883122.51306: stdout chunk (state=3): >>>ic", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "stati<<< 28983 1726883122.51310: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883122.52956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883122.53022: stderr chunk (state=3): >>><<< 28983 1726883122.53025: stdout chunk (state=3): >>><<< 28983 1726883122.53060: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883122.54096: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883122.54105: _low_level_execute_command(): starting 28983 1726883122.54111: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883120.4133167-34387-37003227420069/ > /dev/null 2>&1 && sleep 0' 28983 1726883122.54587: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883122.54590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883122.54593: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
28983 1726883122.54595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883122.54640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883122.54663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883122.54730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883122.56674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883122.56720: stderr chunk (state=3): >>><<< 28983 1726883122.56723: stdout chunk (state=3): >>><<< 28983 1726883122.56738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883122.56745: handler run complete 28983 1726883122.56907: variable 'ansible_facts' from source: unknown 28983 1726883122.57046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883122.57493: variable 'ansible_facts' from source: unknown 28983 1726883122.57618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883122.57817: attempt loop complete, returning result 28983 1726883122.57824: _execute() done 28983 1726883122.57827: dumping result to json 28983 1726883122.57879: done dumping result, returning 28983 1726883122.57887: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-000000002500] 28983 1726883122.57893: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002500 28983 1726883122.58885: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002500 28983 1726883122.58888: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883122.58953: no more pending results, returning what we have 28983 1726883122.58955: results queue empty 28983 1726883122.58956: checking for any_errors_fatal 28983 1726883122.58959: done checking for any_errors_fatal 28983 1726883122.58960: checking for max_fail_percentage 28983 1726883122.58961: done checking for max_fail_percentage 28983 1726883122.58962: checking to see if all hosts have failed and the running result is 
not ok 28983 1726883122.58962: done checking to see if all hosts have failed 28983 1726883122.58963: getting the remaining hosts for this loop 28983 1726883122.58964: done getting the remaining hosts for this loop 28983 1726883122.58967: getting the next task for host managed_node2 28983 1726883122.58974: done getting next task for host managed_node2 28983 1726883122.58977: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883122.58983: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883122.58992: getting variables 28983 1726883122.58993: in VariableManager get_vars() 28983 1726883122.59020: Calling all_inventory to load vars for managed_node2 28983 1726883122.59022: Calling groups_inventory to load vars for managed_node2 28983 1726883122.59024: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883122.59031: Calling all_plugins_play to load vars for managed_node2 28983 1726883122.59033: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883122.59037: Calling groups_plugins_play to load vars for managed_node2 28983 1726883122.60174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883122.61843: done with get_vars() 28983 1726883122.61866: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:45:22 -0400 (0:00:02.265) 0:02:32.617 ****** 28983 1726883122.61947: entering _queue_task() for managed_node2/package_facts 28983 1726883122.62183: worker is 1 (out of 1 available) 28983 1726883122.62197: exiting _queue_task() for managed_node2/package_facts 28983 1726883122.62211: done queuing things up, now waiting for results queue to drain 28983 1726883122.62213: waiting for pending results... 
28983 1726883122.62410: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883122.62538: in run() - task 0affe814-3a2d-b16d-c0a7-000000002501 28983 1726883122.62554: variable 'ansible_search_path' from source: unknown 28983 1726883122.62558: variable 'ansible_search_path' from source: unknown 28983 1726883122.62590: calling self._execute() 28983 1726883122.62696: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883122.62702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883122.62714: variable 'omit' from source: magic vars 28983 1726883122.63068: variable 'ansible_distribution_major_version' from source: facts 28983 1726883122.63080: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883122.63086: variable 'omit' from source: magic vars 28983 1726883122.63155: variable 'omit' from source: magic vars 28983 1726883122.63182: variable 'omit' from source: magic vars 28983 1726883122.63220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883122.63251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883122.63273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883122.63291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883122.63302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883122.63331: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883122.63336: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883122.63340: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883122.63426: Set connection var ansible_connection to ssh 28983 1726883122.63436: Set connection var ansible_shell_executable to /bin/sh 28983 1726883122.63446: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883122.63454: Set connection var ansible_timeout to 10 28983 1726883122.63461: Set connection var ansible_pipelining to False 28983 1726883122.63463: Set connection var ansible_shell_type to sh 28983 1726883122.63484: variable 'ansible_shell_executable' from source: unknown 28983 1726883122.63488: variable 'ansible_connection' from source: unknown 28983 1726883122.63491: variable 'ansible_module_compression' from source: unknown 28983 1726883122.63494: variable 'ansible_shell_type' from source: unknown 28983 1726883122.63497: variable 'ansible_shell_executable' from source: unknown 28983 1726883122.63502: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883122.63507: variable 'ansible_pipelining' from source: unknown 28983 1726883122.63510: variable 'ansible_timeout' from source: unknown 28983 1726883122.63515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883122.63726: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883122.63918: variable 'omit' from source: magic vars 28983 1726883122.63921: starting attempt loop 28983 1726883122.63924: running the handler 28983 1726883122.63926: _low_level_execute_command(): starting 28983 1726883122.63928: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883122.64463: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883122.64484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 28983 1726883122.64552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883122.64613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883122.64632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883122.64666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883122.64772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883122.66527: stdout chunk (state=3): >>>/root <<< 28983 1726883122.66633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883122.66681: stderr chunk (state=3): >>><<< 28983 1726883122.66685: stdout chunk (state=3): >>><<< 28983 1726883122.66705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883122.66719: _low_level_execute_command(): starting 28983 1726883122.66725: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708 `" && echo ansible-tmp-1726883122.667055-34449-200484286420708="` echo /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708 `" ) && sleep 0' 28983 1726883122.67268: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883122.67278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883122.67340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883122.69355: stdout chunk (state=3): >>>ansible-tmp-1726883122.667055-34449-200484286420708=/root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708 <<< 28983 1726883122.69471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883122.69512: stderr chunk (state=3): >>><<< 28983 1726883122.69518: stdout chunk (state=3): >>><<< 28983 1726883122.69537: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883122.667055-34449-200484286420708=/root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883122.69573: variable 'ansible_module_compression' from source: unknown 28983 1726883122.69609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883122.69660: variable 'ansible_facts' from source: unknown 28983 1726883122.69797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py 28983 1726883122.69910: Sending initial data 28983 1726883122.69913: Sent initial data (161 bytes) 28983 1726883122.70346: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883122.70350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883122.70353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883122.70355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883122.70358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883122.70414: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883122.70418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883122.70485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883122.72160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883122.72164: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883122.72231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883122.72294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpir33241y /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py <<< 28983 1726883122.72297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py" <<< 28983 1726883122.72361: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpir33241y" to remote "/root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py" <<< 28983 1726883122.74455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883122.74458: stdout chunk (state=3): >>><<< 28983 1726883122.74460: stderr chunk (state=3): >>><<< 28983 1726883122.74462: done transferring module to remote 28983 1726883122.74464: _low_level_execute_command(): starting 28983 1726883122.74466: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/ /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py && sleep 0' 28983 1726883122.74949: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883122.74961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883122.75051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883122.75085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883122.75101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883122.75120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883122.75211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883122.77147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883122.77192: stderr chunk (state=3): >>><<< 28983 1726883122.77199: stdout chunk (state=3): >>><<< 28983 1726883122.77208: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883122.77211: _low_level_execute_command(): starting 28983 1726883122.77218: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/AnsiballZ_package_facts.py && sleep 0' 28983 1726883122.77630: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883122.77636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883122.77638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883122.77641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883122.77696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883122.77703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 28983 1726883122.77778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883123.41557: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": 
"9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": 
"e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 28983 1726883123.41747: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": 
"cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 28983 1726883123.41752: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", 
"version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": 
"1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": 
"libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null<<< 28983 1726883123.41908: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": 
[{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", 
"version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", 
"version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": 
"gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": 
[{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883123.43619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883123.43719: stderr chunk (state=3): >>><<< 28983 1726883123.43862: stdout chunk (state=3): >>><<< 28983 1726883123.44047: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", 
"version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", 
"version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": 
"libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": 
"nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": 
[{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": 
"python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": 
"2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", 
"version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": 
[{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", 
"version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": 
"6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", 
"version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.46.139 closed. 28983 1726883123.52387: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883123.52634: _low_level_execute_command(): starting 28983 1726883123.52639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883122.667055-34449-200484286420708/ > /dev/null 2>&1 && sleep 0' 28983 1726883123.53915: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883123.53931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883123.53952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883123.54111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883123.54150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883123.54169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883123.54350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883123.56323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883123.56498: stderr chunk (state=3): >>><<< 28983 1726883123.56777: stdout chunk (state=3): >>><<< 28983 1726883123.56781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883123.56784: handler run 
complete 28983 1726883123.59719: variable 'ansible_facts' from source: unknown 28983 1726883123.61476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883123.69040: variable 'ansible_facts' from source: unknown 28983 1726883123.70948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883123.72831: attempt loop complete, returning result 28983 1726883123.72856: _execute() done 28983 1726883123.72863: dumping result to json 28983 1726883123.73260: done dumping result, returning 28983 1726883123.73264: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-000000002501] 28983 1726883123.73267: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002501 28983 1726883123.77258: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002501 28983 1726883123.77262: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883123.77555: no more pending results, returning what we have 28983 1726883123.77559: results queue empty 28983 1726883123.77560: checking for any_errors_fatal 28983 1726883123.77567: done checking for any_errors_fatal 28983 1726883123.77568: checking for max_fail_percentage 28983 1726883123.77573: done checking for max_fail_percentage 28983 1726883123.77574: checking to see if all hosts have failed and the running result is not ok 28983 1726883123.77575: done checking to see if all hosts have failed 28983 1726883123.77576: getting the remaining hosts for this loop 28983 1726883123.77578: done getting the remaining hosts for this loop 28983 1726883123.77582: getting the next task for host managed_node2 28983 1726883123.77592: done getting next task for host managed_node2 28983 
1726883123.77596: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883123.77602: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883123.77618: getting variables 28983 1726883123.77619: in VariableManager get_vars() 28983 1726883123.77765: Calling all_inventory to load vars for managed_node2 28983 1726883123.77769: Calling groups_inventory to load vars for managed_node2 28983 1726883123.77775: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883123.77785: Calling all_plugins_play to load vars for managed_node2 28983 1726883123.77789: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883123.77793: Calling groups_plugins_play to load vars for managed_node2 28983 1726883123.82199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883123.86219: done with get_vars() 28983 1726883123.86364: done getting variables 28983 1726883123.86521: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:45:23 -0400 (0:00:01.246) 0:02:33.863 ****** 28983 1726883123.86618: entering _queue_task() for managed_node2/debug 28983 1726883123.87020: worker is 1 (out of 1 available) 28983 1726883123.87038: exiting _queue_task() for managed_node2/debug 28983 1726883123.87053: done queuing things up, now waiting for results queue to drain 28983 1726883123.87055: waiting for pending results... 
28983 1726883123.87454: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883123.87586: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024a5 28983 1726883123.87611: variable 'ansible_search_path' from source: unknown 28983 1726883123.87622: variable 'ansible_search_path' from source: unknown 28983 1726883123.87695: calling self._execute() 28983 1726883123.87816: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883123.87830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883123.87853: variable 'omit' from source: magic vars 28983 1726883123.88325: variable 'ansible_distribution_major_version' from source: facts 28983 1726883123.88347: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883123.88360: variable 'omit' from source: magic vars 28983 1726883123.88449: variable 'omit' from source: magic vars 28983 1726883123.88604: variable 'network_provider' from source: set_fact 28983 1726883123.88630: variable 'omit' from source: magic vars 28983 1726883123.88830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883123.88979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883123.88983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883123.89154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883123.89157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883123.89179: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883123.89204: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883123.89415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883123.89551: Set connection var ansible_connection to ssh 28983 1726883123.89572: Set connection var ansible_shell_executable to /bin/sh 28983 1726883123.89588: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883123.89603: Set connection var ansible_timeout to 10 28983 1726883123.89614: Set connection var ansible_pipelining to False 28983 1726883123.89622: Set connection var ansible_shell_type to sh 28983 1726883123.89657: variable 'ansible_shell_executable' from source: unknown 28983 1726883123.89775: variable 'ansible_connection' from source: unknown 28983 1726883123.89779: variable 'ansible_module_compression' from source: unknown 28983 1726883123.89782: variable 'ansible_shell_type' from source: unknown 28983 1726883123.89785: variable 'ansible_shell_executable' from source: unknown 28983 1726883123.89787: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883123.89789: variable 'ansible_pipelining' from source: unknown 28983 1726883123.89792: variable 'ansible_timeout' from source: unknown 28983 1726883123.89802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883123.90342: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883123.90346: variable 'omit' from source: magic vars 28983 1726883123.90349: starting attempt loop 28983 1726883123.90352: running the handler 28983 1726883123.90355: handler run complete 28983 1726883123.90454: attempt loop complete, returning result 28983 1726883123.90463: _execute() done 28983 1726883123.90472: dumping result to json 28983 1726883123.90482: done dumping result, returning 
28983 1726883123.90495: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-0000000024a5] 28983 1726883123.90506: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a5 ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883123.90702: no more pending results, returning what we have 28983 1726883123.90707: results queue empty 28983 1726883123.90708: checking for any_errors_fatal 28983 1726883123.90719: done checking for any_errors_fatal 28983 1726883123.90721: checking for max_fail_percentage 28983 1726883123.90723: done checking for max_fail_percentage 28983 1726883123.90724: checking to see if all hosts have failed and the running result is not ok 28983 1726883123.90725: done checking to see if all hosts have failed 28983 1726883123.90726: getting the remaining hosts for this loop 28983 1726883123.90729: done getting the remaining hosts for this loop 28983 1726883123.90736: getting the next task for host managed_node2 28983 1726883123.90747: done getting next task for host managed_node2 28983 1726883123.90752: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883123.90759: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883123.90776: getting variables 28983 1726883123.90778: in VariableManager get_vars() 28983 1726883123.90833: Calling all_inventory to load vars for managed_node2 28983 1726883123.91041: Calling groups_inventory to load vars for managed_node2 28983 1726883123.91045: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883123.91056: Calling all_plugins_play to load vars for managed_node2 28983 1726883123.91061: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883123.91065: Calling groups_plugins_play to load vars for managed_node2 28983 1726883123.92341: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a5 28983 1726883123.92345: WORKER PROCESS EXITING 28983 1726883123.94121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883123.99937: done with get_vars() 28983 1726883123.99998: done getting variables 28983 1726883124.00279: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:45:24 -0400 (0:00:00.137) 0:02:34.001 ****** 28983 1726883124.00331: entering _queue_task() for managed_node2/fail 28983 1726883124.01130: worker is 1 (out of 1 available) 28983 1726883124.01148: exiting _queue_task() for managed_node2/fail 28983 1726883124.01167: done queuing things up, now waiting for results queue to drain 28983 1726883124.01169: waiting for pending results... 28983 1726883124.01789: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883124.02179: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024a6 28983 1726883124.02279: variable 'ansible_search_path' from source: unknown 28983 1726883124.02294: variable 'ansible_search_path' from source: unknown 28983 1726883124.02349: calling self._execute() 28983 1726883124.02595: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883124.02744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883124.02766: variable 'omit' from source: magic vars 28983 1726883124.03796: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.03852: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883124.04236: variable 'network_state' from source: role '' defaults 28983 1726883124.04291: Evaluated conditional (network_state != {}): False 28983 1726883124.04300: when evaluation is False, skipping this task 28983 1726883124.04386: _execute() done 28983 1726883124.04397: dumping result to json 28983 1726883124.04411: done dumping result, returning 28983 1726883124.04429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-0000000024a6] 28983 1726883124.04444: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a6 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883124.04664: no more pending results, returning what we have 28983 1726883124.04669: results queue empty 28983 1726883124.04670: checking for any_errors_fatal 28983 1726883124.04680: done checking for any_errors_fatal 28983 1726883124.04681: checking for max_fail_percentage 28983 1726883124.04683: done checking for max_fail_percentage 28983 1726883124.04685: checking to see if all hosts have failed and the running result is not ok 28983 1726883124.04686: done checking to see if all hosts have failed 28983 1726883124.04687: getting the remaining hosts for this loop 28983 1726883124.04689: done getting the remaining hosts for this loop 28983 1726883124.04694: getting the next task for host managed_node2 28983 1726883124.04706: done getting next task for host managed_node2 28983 1726883124.04712: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883124.04720: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883124.04766: getting variables 28983 1726883124.04768: in VariableManager get_vars() 28983 1726883124.04824: Calling all_inventory to load vars for managed_node2 28983 1726883124.04827: Calling groups_inventory to load vars for managed_node2 28983 1726883124.04829: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883124.04946: Calling all_plugins_play to load vars for managed_node2 28983 1726883124.04951: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883124.04957: Calling groups_plugins_play to load vars for managed_node2 28983 1726883124.05741: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a6 28983 1726883124.05745: WORKER PROCESS EXITING 28983 1726883124.09775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883124.15749: done with get_vars() 28983 1726883124.15797: done getting variables 28983 1726883124.15873: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:45:24 -0400 (0:00:00.155) 0:02:34.156 ****** 28983 1726883124.15918: entering _queue_task() for managed_node2/fail 28983 1726883124.16723: worker is 1 (out of 1 available) 28983 1726883124.16939: exiting _queue_task() for managed_node2/fail 28983 1726883124.16953: done queuing things up, now waiting for results queue to drain 28983 1726883124.16955: waiting for pending results... 28983 1726883124.17319: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883124.17693: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024a7 28983 1726883124.17842: variable 'ansible_search_path' from source: unknown 28983 1726883124.17852: variable 'ansible_search_path' from source: unknown 28983 1726883124.17900: calling self._execute() 28983 1726883124.18365: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883124.18372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883124.18376: variable 'omit' from source: magic vars 28983 1726883124.19298: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.19320: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883124.19627: variable 'network_state' from source: role '' defaults 28983 1726883124.19756: Evaluated conditional (network_state != {}): False 28983 1726883124.19765: when evaluation is False, skipping this task 28983 1726883124.19782: _execute() done 28983 1726883124.19792: dumping result to json 28983 1726883124.19801: done dumping result, returning 28983 1726883124.19813: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-0000000024a7] 28983 1726883124.19861: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a7 28983 1726883124.20231: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a7 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883124.20291: no more pending results, returning what we have 28983 1726883124.20296: results queue empty 28983 1726883124.20297: checking for any_errors_fatal 28983 1726883124.20309: done checking for any_errors_fatal 28983 1726883124.20310: checking for max_fail_percentage 28983 1726883124.20312: done checking for max_fail_percentage 28983 1726883124.20313: checking to see if all hosts have failed and the running result is not ok 28983 1726883124.20314: done checking to see if all hosts have failed 28983 1726883124.20315: getting the remaining hosts for this loop 28983 1726883124.20317: done getting the remaining hosts for this loop 28983 1726883124.20322: getting the next task for host managed_node2 28983 1726883124.20333: done getting next task for host managed_node2 28983 1726883124.20339: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883124.20346: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883124.20388: getting variables 28983 1726883124.20390: in VariableManager get_vars() 28983 1726883124.20549: Calling all_inventory to load vars for managed_node2 28983 1726883124.20553: Calling groups_inventory to load vars for managed_node2 28983 1726883124.20556: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883124.20569: Calling all_plugins_play to load vars for managed_node2 28983 1726883124.20573: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883124.20577: Calling groups_plugins_play to load vars for managed_node2 28983 1726883124.21340: WORKER PROCESS EXITING 28983 1726883124.25091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883124.31805: done with get_vars() 28983 1726883124.31952: done getting variables 28983 1726883124.32126: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:45:24 -0400 (0:00:00.162) 0:02:34.319 ****** 28983 1726883124.32220: entering _queue_task() for managed_node2/fail 28983 1726883124.33016: worker is 1 (out of 1 available) 28983 1726883124.33028: exiting _queue_task() for managed_node2/fail 28983 1726883124.33041: done queuing things up, now waiting for results queue to drain 28983 1726883124.33043: waiting for pending results... 28983 1726883124.33659: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883124.34063: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024a8 28983 1726883124.34089: variable 'ansible_search_path' from source: unknown 28983 1726883124.34100: variable 'ansible_search_path' from source: unknown 28983 1726883124.34149: calling self._execute() 28983 1726883124.34268: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883124.34450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883124.34471: variable 'omit' from source: magic vars 28983 1726883124.35325: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.35739: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883124.36139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883124.41209: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883124.41314: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883124.41368: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883124.41584: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883124.41620: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883124.41720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.41976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.42015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.42073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.42098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.42425: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.42452: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883124.42606: variable 'ansible_distribution' from source: facts 28983 1726883124.42939: variable '__network_rh_distros' from source: role '' defaults 28983 1726883124.42947: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883124.42950: when evaluation is False, skipping this task 28983 1726883124.42952: _execute() done 28983 1726883124.42954: dumping result to json 28983 1726883124.42957: done dumping 
result, returning 28983 1726883124.42959: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-0000000024a8] 28983 1726883124.42962: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a8 28983 1726883124.43040: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a8 28983 1726883124.43044: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883124.43102: no more pending results, returning what we have 28983 1726883124.43106: results queue empty 28983 1726883124.43107: checking for any_errors_fatal 28983 1726883124.43116: done checking for any_errors_fatal 28983 1726883124.43117: checking for max_fail_percentage 28983 1726883124.43119: done checking for max_fail_percentage 28983 1726883124.43121: checking to see if all hosts have failed and the running result is not ok 28983 1726883124.43121: done checking to see if all hosts have failed 28983 1726883124.43122: getting the remaining hosts for this loop 28983 1726883124.43125: done getting the remaining hosts for this loop 28983 1726883124.43129: getting the next task for host managed_node2 28983 1726883124.43141: done getting next task for host managed_node2 28983 1726883124.43146: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883124.43152: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883124.43192: getting variables 28983 1726883124.43194: in VariableManager get_vars() 28983 1726883124.43364: Calling all_inventory to load vars for managed_node2 28983 1726883124.43368: Calling groups_inventory to load vars for managed_node2 28983 1726883124.43374: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883124.43384: Calling all_plugins_play to load vars for managed_node2 28983 1726883124.43389: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883124.43392: Calling groups_plugins_play to load vars for managed_node2 28983 1726883124.46533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883124.50472: done with get_vars() 28983 1726883124.50531: done getting variables 28983 1726883124.50610: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:45:24 -0400 (0:00:00.184) 0:02:34.504 ****** 28983 1726883124.50656: entering _queue_task() for managed_node2/dnf 28983 1726883124.51053: worker is 1 (out of 1 available) 28983 1726883124.51067: exiting _queue_task() for managed_node2/dnf 28983 1726883124.51079: done queuing things up, now waiting for results queue to drain 28983 1726883124.51081: waiting for pending results... 28983 1726883124.51412: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883124.51600: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024a9 28983 1726883124.51626: variable 'ansible_search_path' from source: unknown 28983 1726883124.51640: variable 'ansible_search_path' from source: unknown 28983 1726883124.51693: calling self._execute() 28983 1726883124.51820: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883124.51837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883124.51855: variable 'omit' from source: magic vars 28983 1726883124.52328: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.52351: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883124.52626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883124.56068: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883124.56163: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883124.56210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883124.63316: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883124.63343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883124.63407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.63444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.63467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.63504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.63519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.63618: variable 'ansible_distribution' from source: facts 28983 1726883124.63622: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.63631: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883124.63735: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883124.63866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.63886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.63907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.63943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.63955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.64002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.64050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.64124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.64128: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.64131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.64254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.64259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.64262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.64279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.64297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.64489: variable 'network_connections' from source: include params 28983 1726883124.64502: variable 'interface' from source: play vars 28983 1726883124.64576: variable 'interface' from source: play vars 28983 1726883124.64842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883124.64855: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883124.64901: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883124.64949: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883124.64994: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883124.65079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883124.65087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883124.65180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.65186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883124.65189: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883124.65727: variable 'network_connections' from source: include params 28983 1726883124.65731: variable 'interface' from source: play vars 28983 1726883124.65759: variable 'interface' from source: play vars 28983 1726883124.65789: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883124.65793: when evaluation is False, skipping this task 28983 1726883124.65795: _execute() done 28983 1726883124.65798: dumping result to json 28983 1726883124.65803: done dumping result, returning 28983 1726883124.65813: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000024a9] 28983 1726883124.65816: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a9 28983 1726883124.66280: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024a9 28983 1726883124.66285: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883124.66344: no more pending results, returning what we have 28983 1726883124.66348: results queue empty 28983 1726883124.66349: checking for any_errors_fatal 28983 1726883124.66358: done checking for any_errors_fatal 28983 1726883124.66359: checking for max_fail_percentage 28983 1726883124.66361: done checking for max_fail_percentage 28983 1726883124.66363: checking to see if all hosts have failed and the running result is not ok 28983 1726883124.66364: done checking to see if all hosts have failed 28983 1726883124.66365: getting the remaining hosts for this loop 28983 1726883124.66367: done getting the remaining hosts for this loop 28983 1726883124.66380: getting the next task for host managed_node2 28983 1726883124.66398: done getting next task for host managed_node2 28983 1726883124.66403: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883124.66409: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883124.66514: getting variables 28983 1726883124.66516: in VariableManager get_vars() 28983 1726883124.66636: Calling all_inventory to load vars for managed_node2 28983 1726883124.66639: Calling groups_inventory to load vars for managed_node2 28983 1726883124.66642: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883124.66653: Calling all_plugins_play to load vars for managed_node2 28983 1726883124.66657: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883124.66661: Calling groups_plugins_play to load vars for managed_node2 28983 1726883124.75626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883124.78668: done with get_vars() 28983 1726883124.78714: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883124.78806: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:45:24 -0400 (0:00:00.281) 0:02:34.786 ****** 28983 1726883124.78848: entering _queue_task() for managed_node2/yum 28983 1726883124.79268: worker is 1 (out of 1 available) 28983 1726883124.79283: exiting _queue_task() for managed_node2/yum 28983 1726883124.79297: done queuing things up, now waiting for results queue to drain 28983 1726883124.79299: waiting for pending results... 28983 1726883124.79717: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883124.79840: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024aa 28983 1726883124.79859: variable 'ansible_search_path' from source: unknown 28983 1726883124.79863: variable 'ansible_search_path' from source: unknown 28983 1726883124.79929: calling self._execute() 28983 1726883124.80041: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883124.80045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883124.80060: variable 'omit' from source: magic vars 28983 1726883124.80560: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.80579: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883124.80845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883124.84438: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883124.84578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883124.84689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883124.84739: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883124.84805: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883124.85110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.85152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.85185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.85364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.85381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.85694: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.85719: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883124.85723: when evaluation is False, skipping this task 28983 1726883124.85726: _execute() done 28983 1726883124.85730: dumping result to json 28983 1726883124.85741: done dumping result, returning 28983 1726883124.85884: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000024aa] 28983 1726883124.85888: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024aa 28983 1726883124.86209: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024aa 28983 1726883124.86212: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883124.86303: no more pending results, returning what we have 28983 1726883124.86308: results queue empty 28983 1726883124.86314: checking for any_errors_fatal 28983 1726883124.86327: done checking for any_errors_fatal 28983 1726883124.86328: checking for max_fail_percentage 28983 1726883124.86330: done checking for max_fail_percentage 28983 1726883124.86331: checking to see if all hosts have failed and the running result is not ok 28983 1726883124.86332: done checking to see if all hosts have failed 28983 1726883124.86335: getting the remaining hosts for this loop 28983 1726883124.86338: done getting the remaining hosts for this loop 28983 1726883124.86344: getting the next task for host managed_node2 28983 1726883124.86357: done getting next task for host managed_node2 28983 1726883124.86362: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883124.86369: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883124.86415: getting variables 28983 1726883124.86418: in VariableManager get_vars() 28983 1726883124.86799: Calling all_inventory to load vars for managed_node2 28983 1726883124.86803: Calling groups_inventory to load vars for managed_node2 28983 1726883124.86806: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883124.86817: Calling all_plugins_play to load vars for managed_node2 28983 1726883124.86821: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883124.86830: Calling groups_plugins_play to load vars for managed_node2 28983 1726883124.89262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883124.92538: done with get_vars() 28983 1726883124.92580: done getting variables 28983 1726883124.92662: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:45:24 -0400 (0:00:00.138) 0:02:34.924 ****** 28983 1726883124.92708: entering _queue_task() for managed_node2/fail 28983 1726883124.93184: worker is 1 (out of 1 available) 28983 1726883124.93196: exiting _queue_task() for managed_node2/fail 28983 1726883124.93208: done queuing things up, now waiting for results queue to drain 28983 1726883124.93210: waiting for pending results... 
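For context, the task being queued here can be sketched as an Ansible task definition. This is a hedged reconstruction: the task name, the `fail` action module, the file location (`roles/network/tasks/main.yml:60`), and the `when` condition string are all taken verbatim from the log above, but the YAML layout and the `msg` placeholder are assumptions, not the role's actual source.

```yaml
# Hypothetical sketch of the task at roles/network/tasks/main.yml:60
# in fedora.linux_system_roles.network. Only the name, module, and
# condition are confirmed by the log; the msg text is a placeholder.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: "..."  # actual message elided; not recoverable from the log
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

In the run traced here both role-default flags evaluate to false (no wireless or team connections in `network_connections`), so the conditional is `False` and the task is skipped, as the `skipping: [managed_node2]` result below shows.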
28983 1726883124.93504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883124.93647: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024ab 28983 1726883124.93673: variable 'ansible_search_path' from source: unknown 28983 1726883124.93677: variable 'ansible_search_path' from source: unknown 28983 1726883124.93746: calling self._execute() 28983 1726883124.93853: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883124.93862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883124.93881: variable 'omit' from source: magic vars 28983 1726883124.94375: variable 'ansible_distribution_major_version' from source: facts 28983 1726883124.94385: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883124.94568: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883124.94872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883124.97069: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883124.97130: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883124.97167: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883124.97207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883124.97243: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883124.97316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883124.97342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.97363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.97403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.97432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.97474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.97497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.97519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.97553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.97565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.97601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883124.97624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883124.97681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.97711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883124.97724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883124.97885: variable 'network_connections' from source: include params 28983 1726883124.97896: variable 'interface' from source: play vars 28983 1726883124.97953: variable 'interface' from source: play vars 28983 1726883124.98014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883124.98181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883124.98219: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883124.98247: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883124.98280: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883124.98317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883124.98352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883124.98382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883124.98404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883124.98446: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883124.98692: variable 'network_connections' from source: include params 28983 1726883124.98700: variable 'interface' from source: play vars 28983 1726883124.98778: variable 'interface' from source: play vars 28983 1726883124.98796: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883124.98799: when evaluation is False, skipping this task 28983 1726883124.98802: _execute() done 28983 1726883124.98806: dumping result to json 28983 1726883124.98811: done dumping result, returning 28983 1726883124.98818: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000024ab] 28983 1726883124.98830: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ab 28983 1726883124.98969: done sending task result for task 
0affe814-3a2d-b16d-c0a7-0000000024ab 28983 1726883124.98974: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883124.99132: no more pending results, returning what we have 28983 1726883124.99216: results queue empty 28983 1726883124.99218: checking for any_errors_fatal 28983 1726883124.99223: done checking for any_errors_fatal 28983 1726883124.99224: checking for max_fail_percentage 28983 1726883124.99226: done checking for max_fail_percentage 28983 1726883124.99227: checking to see if all hosts have failed and the running result is not ok 28983 1726883124.99228: done checking to see if all hosts have failed 28983 1726883124.99229: getting the remaining hosts for this loop 28983 1726883124.99231: done getting the remaining hosts for this loop 28983 1726883124.99258: getting the next task for host managed_node2 28983 1726883124.99274: done getting next task for host managed_node2 28983 1726883124.99279: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883124.99286: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883124.99314: getting variables 28983 1726883124.99316: in VariableManager get_vars() 28983 1726883124.99378: Calling all_inventory to load vars for managed_node2 28983 1726883124.99382: Calling groups_inventory to load vars for managed_node2 28983 1726883124.99385: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883124.99402: Calling all_plugins_play to load vars for managed_node2 28983 1726883124.99407: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883124.99411: Calling groups_plugins_play to load vars for managed_node2 28983 1726883125.01290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883125.03222: done with get_vars() 28983 1726883125.03252: done getting variables 28983 1726883125.03303: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:45:25 -0400 (0:00:00.106) 0:02:35.031 ****** 28983 1726883125.03342: entering _queue_task() for managed_node2/package 28983 1726883125.03617: worker is 1 (out of 1 available) 28983 1726883125.03632: exiting _queue_task() for managed_node2/package 28983 1726883125.03649: done queuing things up, now 
waiting for results queue to drain 28983 1726883125.03651: waiting for pending results... 28983 1726883125.03903: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883125.04052: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024ac 28983 1726883125.04114: variable 'ansible_search_path' from source: unknown 28983 1726883125.04118: variable 'ansible_search_path' from source: unknown 28983 1726883125.04146: calling self._execute() 28983 1726883125.04264: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.04269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883125.04274: variable 'omit' from source: magic vars 28983 1726883125.04640: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.04653: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883125.04883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883125.05129: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883125.05168: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883125.05203: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883125.05264: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883125.05372: variable 'network_packages' from source: role '' defaults 28983 1726883125.05463: variable '__network_provider_setup' from source: role '' defaults 28983 1726883125.05476: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883125.05532: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883125.05544: variable '__network_packages_default_nm' 
from source: role '' defaults 28983 1726883125.05596: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883125.05763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883125.08139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883125.08143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883125.08158: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883125.08206: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883125.08245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883125.08342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.08386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.08420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.08476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.08492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.08529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.08554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.08580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.08612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.08625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.08815: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883125.08922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.08955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.08978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883125.09014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.09028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.09107: variable 'ansible_python' from source: facts 28983 1726883125.09120: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883125.09189: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883125.09259: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883125.09370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.09392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.09412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.09450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.09464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883125.09505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.09530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.09554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.09589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.09601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.09722: variable 'network_connections' from source: include params 28983 1726883125.09728: variable 'interface' from source: play vars 28983 1726883125.09816: variable 'interface' from source: play vars 28983 1726883125.09880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883125.09905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883125.09929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 
1726883125.09957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883125.10003: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883125.10239: variable 'network_connections' from source: include params 28983 1726883125.10245: variable 'interface' from source: play vars 28983 1726883125.10331: variable 'interface' from source: play vars 28983 1726883125.10358: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883125.10427: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883125.10752: variable 'network_connections' from source: include params 28983 1726883125.10756: variable 'interface' from source: play vars 28983 1726883125.10939: variable 'interface' from source: play vars 28983 1726883125.10943: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883125.10955: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883125.11354: variable 'network_connections' from source: include params 28983 1726883125.11365: variable 'interface' from source: play vars 28983 1726883125.11446: variable 'interface' from source: play vars 28983 1726883125.11513: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883125.11615: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883125.11629: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883125.11716: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883125.11922: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883125.12739: variable 'network_connections' from source: include params 28983 
1726883125.12743: variable 'interface' from source: play vars 28983 1726883125.12746: variable 'interface' from source: play vars 28983 1726883125.12748: variable 'ansible_distribution' from source: facts 28983 1726883125.12750: variable '__network_rh_distros' from source: role '' defaults 28983 1726883125.12753: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.12755: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883125.12894: variable 'ansible_distribution' from source: facts 28983 1726883125.12906: variable '__network_rh_distros' from source: role '' defaults 28983 1726883125.12924: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.12931: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883125.13092: variable 'ansible_distribution' from source: facts 28983 1726883125.13095: variable '__network_rh_distros' from source: role '' defaults 28983 1726883125.13107: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.13138: variable 'network_provider' from source: set_fact 28983 1726883125.13154: variable 'ansible_facts' from source: unknown 28983 1726883125.13732: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883125.13738: when evaluation is False, skipping this task 28983 1726883125.13741: _execute() done 28983 1726883125.13743: dumping result to json 28983 1726883125.13748: done dumping result, returning 28983 1726883125.13761: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-0000000024ac] 28983 1726883125.13766: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ac 28983 1726883125.13866: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ac 28983 1726883125.13872: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883125.13929: no more pending results, returning what we have 28983 1726883125.13933: results queue empty 28983 1726883125.13935: checking for any_errors_fatal 28983 1726883125.13942: done checking for any_errors_fatal 28983 1726883125.13943: checking for max_fail_percentage 28983 1726883125.13945: done checking for max_fail_percentage 28983 1726883125.13946: checking to see if all hosts have failed and the running result is not ok 28983 1726883125.13947: done checking to see if all hosts have failed 28983 1726883125.13948: getting the remaining hosts for this loop 28983 1726883125.13950: done getting the remaining hosts for this loop 28983 1726883125.13955: getting the next task for host managed_node2 28983 1726883125.13963: done getting next task for host managed_node2 28983 1726883125.13967: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883125.13975: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883125.14007: getting variables 28983 1726883125.14009: in VariableManager get_vars() 28983 1726883125.14073: Calling all_inventory to load vars for managed_node2 28983 1726883125.14077: Calling groups_inventory to load vars for managed_node2 28983 1726883125.14081: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883125.14090: Calling all_plugins_play to load vars for managed_node2 28983 1726883125.14094: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883125.14097: Calling groups_plugins_play to load vars for managed_node2 28983 1726883125.16038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883125.17769: done with get_vars() 28983 1726883125.17794: done getting variables 28983 1726883125.17847: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:45:25 -0400 (0:00:00.145) 0:02:35.176 ****** 28983 1726883125.17879: entering _queue_task() for managed_node2/package 28983 1726883125.18128: worker is 1 (out of 1 available) 28983 1726883125.18146: exiting _queue_task() for managed_node2/package 28983 1726883125.18159: done queuing things up, now waiting for results 
queue to drain 28983 1726883125.18161: waiting for pending results... 28983 1726883125.18359: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883125.18474: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024ad 28983 1726883125.18486: variable 'ansible_search_path' from source: unknown 28983 1726883125.18490: variable 'ansible_search_path' from source: unknown 28983 1726883125.18527: calling self._execute() 28983 1726883125.18616: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.18627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883125.18636: variable 'omit' from source: magic vars 28983 1726883125.18978: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.18989: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883125.19167: variable 'network_state' from source: role '' defaults 28983 1726883125.19174: Evaluated conditional (network_state != {}): False 28983 1726883125.19183: when evaluation is False, skipping this task 28983 1726883125.19195: _execute() done 28983 1726883125.19198: dumping result to json 28983 1726883125.19201: done dumping result, returning 28983 1726883125.19205: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000024ad] 28983 1726883125.19208: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ad 28983 1726883125.19298: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ad 28983 1726883125.19301: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883125.19354: no more pending results, returning what we have 28983 
1726883125.19358: results queue empty 28983 1726883125.19359: checking for any_errors_fatal 28983 1726883125.19364: done checking for any_errors_fatal 28983 1726883125.19365: checking for max_fail_percentage 28983 1726883125.19367: done checking for max_fail_percentage 28983 1726883125.19368: checking to see if all hosts have failed and the running result is not ok 28983 1726883125.19369: done checking to see if all hosts have failed 28983 1726883125.19372: getting the remaining hosts for this loop 28983 1726883125.19374: done getting the remaining hosts for this loop 28983 1726883125.19379: getting the next task for host managed_node2 28983 1726883125.19387: done getting next task for host managed_node2 28983 1726883125.19391: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883125.19397: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883125.19424: getting variables 28983 1726883125.19425: in VariableManager get_vars() 28983 1726883125.19525: Calling all_inventory to load vars for managed_node2 28983 1726883125.19528: Calling groups_inventory to load vars for managed_node2 28983 1726883125.19530: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883125.19539: Calling all_plugins_play to load vars for managed_node2 28983 1726883125.19542: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883125.19544: Calling groups_plugins_play to load vars for managed_node2 28983 1726883125.20771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883125.22505: done with get_vars() 28983 1726883125.22527: done getting variables 28983 1726883125.22575: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:45:25 -0400 (0:00:00.047) 0:02:35.223 ****** 28983 1726883125.22606: entering _queue_task() for managed_node2/package 28983 1726883125.22822: worker is 1 (out of 1 available) 28983 1726883125.22839: exiting _queue_task() for managed_node2/package 28983 1726883125.22853: done queuing things up, now waiting for results queue to drain 28983 1726883125.22855: waiting for pending results... 
28983 1726883125.23059: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883125.23173: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024ae 28983 1726883125.23194: variable 'ansible_search_path' from source: unknown 28983 1726883125.23199: variable 'ansible_search_path' from source: unknown 28983 1726883125.23227: calling self._execute() 28983 1726883125.23322: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.23328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883125.23341: variable 'omit' from source: magic vars 28983 1726883125.23676: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.23689: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883125.23804: variable 'network_state' from source: role '' defaults 28983 1726883125.23814: Evaluated conditional (network_state != {}): False 28983 1726883125.23817: when evaluation is False, skipping this task 28983 1726883125.23820: _execute() done 28983 1726883125.23825: dumping result to json 28983 1726883125.23830: done dumping result, returning 28983 1726883125.23839: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-0000000024ae] 28983 1726883125.23847: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ae 28983 1726883125.23954: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024ae 28983 1726883125.23960: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883125.24011: no more pending results, returning what we have 28983 1726883125.24015: results queue empty 28983 1726883125.24016: checking for 
any_errors_fatal 28983 1726883125.24022: done checking for any_errors_fatal 28983 1726883125.24023: checking for max_fail_percentage 28983 1726883125.24025: done checking for max_fail_percentage 28983 1726883125.24026: checking to see if all hosts have failed and the running result is not ok 28983 1726883125.24027: done checking to see if all hosts have failed 28983 1726883125.24028: getting the remaining hosts for this loop 28983 1726883125.24030: done getting the remaining hosts for this loop 28983 1726883125.24036: getting the next task for host managed_node2 28983 1726883125.24044: done getting next task for host managed_node2 28983 1726883125.24049: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883125.24055: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883125.24081: getting variables 28983 1726883125.24083: in VariableManager get_vars() 28983 1726883125.24122: Calling all_inventory to load vars for managed_node2 28983 1726883125.24130: Calling groups_inventory to load vars for managed_node2 28983 1726883125.24132: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883125.24141: Calling all_plugins_play to load vars for managed_node2 28983 1726883125.24143: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883125.24145: Calling groups_plugins_play to load vars for managed_node2 28983 1726883125.25379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883125.27010: done with get_vars() 28983 1726883125.27031: done getting variables 28983 1726883125.27085: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:45:25 -0400 (0:00:00.045) 0:02:35.268 ****** 28983 1726883125.27113: entering _queue_task() for managed_node2/service 28983 1726883125.27318: worker is 1 (out of 1 available) 28983 1726883125.27332: exiting _queue_task() for managed_node2/service 28983 1726883125.27347: done queuing things up, now waiting for results queue to drain 28983 1726883125.27349: waiting for pending results... 
28983 1726883125.27546: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883125.27660: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024af 28983 1726883125.27675: variable 'ansible_search_path' from source: unknown 28983 1726883125.27681: variable 'ansible_search_path' from source: unknown 28983 1726883125.27714: calling self._execute() 28983 1726883125.27800: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.27806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883125.27816: variable 'omit' from source: magic vars 28983 1726883125.28135: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.28152: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883125.28255: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883125.28424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883125.30586: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883125.30639: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883125.30684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883125.30715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883125.30741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883125.30810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883125.30836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.30863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.30901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.30913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.30956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.30985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.31005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.31037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.31050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.31092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.31113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.31135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.31166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.31182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.31331: variable 'network_connections' from source: include params 28983 1726883125.31345: variable 'interface' from source: play vars 28983 1726883125.31400: variable 'interface' from source: play vars 28983 1726883125.31467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883125.31604: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883125.31643: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883125.31673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883125.31707: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883125.31749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883125.31767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883125.31790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.31811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883125.31858: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883125.32061: variable 'network_connections' from source: include params 28983 1726883125.32065: variable 'interface' from source: play vars 28983 1726883125.32120: variable 'interface' from source: play vars 28983 1726883125.32141: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883125.32145: when evaluation is False, skipping this task 28983 1726883125.32148: _execute() done 28983 1726883125.32154: dumping result to json 28983 1726883125.32157: done dumping result, returning 28983 1726883125.32166: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-0000000024af] 28983 1726883125.32174: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024af 28983 1726883125.32272: done sending task result for task 
0affe814-3a2d-b16d-c0a7-0000000024af 28983 1726883125.32281: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883125.32348: no more pending results, returning what we have 28983 1726883125.32351: results queue empty 28983 1726883125.32352: checking for any_errors_fatal 28983 1726883125.32360: done checking for any_errors_fatal 28983 1726883125.32360: checking for max_fail_percentage 28983 1726883125.32363: done checking for max_fail_percentage 28983 1726883125.32364: checking to see if all hosts have failed and the running result is not ok 28983 1726883125.32365: done checking to see if all hosts have failed 28983 1726883125.32366: getting the remaining hosts for this loop 28983 1726883125.32368: done getting the remaining hosts for this loop 28983 1726883125.32375: getting the next task for host managed_node2 28983 1726883125.32384: done getting next task for host managed_node2 28983 1726883125.32389: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883125.32402: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883125.32430: getting variables 28983 1726883125.32432: in VariableManager get_vars() 28983 1726883125.32480: Calling all_inventory to load vars for managed_node2 28983 1726883125.32483: Calling groups_inventory to load vars for managed_node2 28983 1726883125.32485: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883125.32494: Calling all_plugins_play to load vars for managed_node2 28983 1726883125.32497: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883125.32509: Calling groups_plugins_play to load vars for managed_node2 28983 1726883125.33954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883125.35561: done with get_vars() 28983 1726883125.35585: done getting variables 28983 1726883125.35630: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:45:25 -0400 (0:00:00.085) 0:02:35.354 ****** 28983 1726883125.35662: entering _queue_task() for managed_node2/service 28983 1726883125.35895: worker is 1 (out of 1 available) 28983 1726883125.35909: exiting _queue_task() for managed_node2/service 28983 1726883125.35926: done 
queuing things up, now waiting for results queue to drain 28983 1726883125.35927: waiting for pending results... 28983 1726883125.36152: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883125.36285: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b0 28983 1726883125.36298: variable 'ansible_search_path' from source: unknown 28983 1726883125.36301: variable 'ansible_search_path' from source: unknown 28983 1726883125.36337: calling self._execute() 28983 1726883125.36426: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.36433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883125.36445: variable 'omit' from source: magic vars 28983 1726883125.36784: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.36795: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883125.36948: variable 'network_provider' from source: set_fact 28983 1726883125.36954: variable 'network_state' from source: role '' defaults 28983 1726883125.36964: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883125.36970: variable 'omit' from source: magic vars 28983 1726883125.37023: variable 'omit' from source: magic vars 28983 1726883125.37141: variable 'network_service_name' from source: role '' defaults 28983 1726883125.37147: variable 'network_service_name' from source: role '' defaults 28983 1726883125.37210: variable '__network_provider_setup' from source: role '' defaults 28983 1726883125.37214: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883125.37276: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883125.37284: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883125.37337: variable '__network_packages_default_nm' from source: role '' 
defaults 28983 1726883125.37739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883125.39685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883125.39754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883125.39792: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883125.39825: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883125.39850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883125.39920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.39947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.39967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.40006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.40021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.40062: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.40085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.40105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.40149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.40161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.40373: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883125.40469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.40492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.40512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.40548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.40563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.40632: variable 'ansible_python' from source: facts 28983 1726883125.40654: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883125.40717: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883125.40788: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883125.40896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.40918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.40939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.40974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.40989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.41029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883125.41054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883125.41076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.41141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883125.41144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883125.41288: variable 'network_connections' from source: include params 28983 1726883125.41296: variable 'interface' from source: play vars 28983 1726883125.41376: variable 'interface' from source: play vars 28983 1726883125.41478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883125.41695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883125.41724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883125.41947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883125.41951: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883125.41954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883125.41965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883125.42015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883125.42081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883125.42143: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883125.42604: variable 'network_connections' from source: include params 28983 1726883125.42627: variable 'interface' from source: play vars 28983 1726883125.42691: variable 'interface' from source: play vars 28983 1726883125.42725: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883125.42791: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883125.43042: variable 'network_connections' from source: include params 28983 1726883125.43047: variable 'interface' from source: play vars 28983 1726883125.43104: variable 'interface' from source: play vars 28983 1726883125.43124: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883125.43196: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883125.43441: variable 'network_connections' from source: include params 28983 1726883125.43446: variable 'interface' from source: play vars 28983 1726883125.43507: variable 'interface' from source: play vars 28983 1726883125.43551: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883125.43605: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883125.43613: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883125.43663: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883125.43850: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883125.44275: variable 'network_connections' from source: include params 28983 1726883125.44278: variable 'interface' from source: play vars 28983 1726883125.44327: variable 'interface' from source: play vars 28983 1726883125.44337: variable 'ansible_distribution' from source: facts 28983 1726883125.44340: variable '__network_rh_distros' from source: role '' defaults 28983 1726883125.44353: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.44365: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883125.44509: variable 'ansible_distribution' from source: facts 28983 1726883125.44513: variable '__network_rh_distros' from source: role '' defaults 28983 1726883125.44519: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.44526: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883125.44674: variable 'ansible_distribution' from source: facts 28983 1726883125.44678: variable '__network_rh_distros' from source: role '' defaults 28983 1726883125.44680: variable 'ansible_distribution_major_version' from source: facts 28983 1726883125.44712: variable 'network_provider' from source: set_fact 28983 1726883125.44730: variable 'omit' from source: magic vars 28983 1726883125.44755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883125.44779: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883125.44804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883125.44815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883125.44824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883125.44852: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883125.44855: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.44860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883125.45058: Set connection var ansible_connection to ssh 28983 1726883125.45062: Set connection var ansible_shell_executable to /bin/sh 28983 1726883125.45064: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883125.45066: Set connection var ansible_timeout to 10 28983 1726883125.45068: Set connection var ansible_pipelining to False 28983 1726883125.45073: Set connection var ansible_shell_type to sh 28983 1726883125.45076: variable 'ansible_shell_executable' from source: unknown 28983 1726883125.45078: variable 'ansible_connection' from source: unknown 28983 1726883125.45080: variable 'ansible_module_compression' from source: unknown 28983 1726883125.45083: variable 'ansible_shell_type' from source: unknown 28983 1726883125.45085: variable 'ansible_shell_executable' from source: unknown 28983 1726883125.45087: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883125.45089: variable 'ansible_pipelining' from source: unknown 28983 1726883125.45091: variable 'ansible_timeout' from source: unknown 28983 1726883125.45093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883125.45158: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883125.45169: variable 'omit' from source: magic vars 28983 1726883125.45174: starting attempt loop 28983 1726883125.45177: running the handler 28983 1726883125.45246: variable 'ansible_facts' from source: unknown 28983 1726883125.46738: _low_level_execute_command(): starting 28983 1726883125.46745: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883125.47833: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883125.47869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883125.47875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883125.47878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883125.47881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883125.47946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 28983 1726883125.47982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883125.48049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883125.49835: stdout chunk (state=3): >>>/root <<< 28983 1726883125.49941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883125.50155: stderr chunk (state=3): >>><<< 28983 1726883125.50159: stdout chunk (state=3): >>><<< 28983 1726883125.50221: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883125.50249: _low_level_execute_command(): starting 28983 1726883125.50281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834 `" && echo ansible-tmp-1726883125.5023332-34518-208348498494834="` echo /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834 `" ) && sleep 0' 28983 1726883125.50979: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883125.51006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883125.51094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883125.51137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883125.51155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883125.51173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883125.51282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883125.53428: stdout chunk (state=3): >>>ansible-tmp-1726883125.5023332-34518-208348498494834=/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834 <<< 28983 1726883125.53510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 
1726883125.53591: stderr chunk (state=3): >>><<< 28983 1726883125.53601: stdout chunk (state=3): >>><<< 28983 1726883125.53948: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883125.5023332-34518-208348498494834=/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883125.53952: variable 'ansible_module_compression' from source: unknown 28983 1726883125.53955: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883125.53958: variable 'ansible_facts' from source: unknown 28983 1726883125.54264: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py 28983 1726883125.54615: Sending initial data 28983 1726883125.54753: Sent initial data (156 
bytes) 28983 1726883125.55892: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883125.55896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883125.55904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883125.55906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883125.56039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883125.56088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883125.56359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883125.57983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883125.58030: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883125.58118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmps7xbw4sv /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py <<< 28983 1726883125.58128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py" <<< 28983 1726883125.58279: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmps7xbw4sv" to remote "/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py" <<< 28983 1726883125.63405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883125.63515: stderr chunk (state=3): >>><<< 28983 1726883125.63518: stdout chunk (state=3): >>><<< 28983 1726883125.63707: done transferring module to remote 28983 1726883125.63711: _low_level_execute_command(): starting 28983 1726883125.63719: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/ /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py && sleep 0' 28983 1726883125.64907: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883125.64911: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883125.64914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883125.65259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883125.65263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883125.67195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883125.67483: stderr chunk (state=3): >>><<< 28983 1726883125.67488: stdout chunk (state=3): >>><<< 28983 1726883125.67491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883125.67494: _low_level_execute_command(): starting 28983 1726883125.67496: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/AnsiballZ_systemd.py && sleep 0' 28983 1726883125.68222: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883125.68242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883125.68259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883125.68314: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883125.68343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883125.68424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883126.01221: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4538368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1740563000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 28983 1726883126.01466: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": 
"no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target 
system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", 
"ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883126.03241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883126.03260: stderr chunk (state=3): >>>Shared connection to 10.31.46.139 closed. <<< 28983 1726883126.03375: stderr chunk (state=3): >>><<< 28983 1726883126.03386: stdout chunk (state=3): >>><<< 28983 1726883126.03414: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4538368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1740563000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", 
"BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883126.03754: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883126.03777: _low_level_execute_command(): starting 28983 1726883126.03785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883125.5023332-34518-208348498494834/ > /dev/null 2>&1 && sleep 0' 28983 1726883126.04492: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883126.04516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883126.04519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.04584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883126.04594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883126.04656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883126.06616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883126.06670: stderr chunk (state=3): >>><<< 28983 1726883126.06678: stdout chunk (state=3): >>><<< 28983 1726883126.06722: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883126.06725: handler run complete 28983 1726883126.06801: attempt loop complete, returning result 28983 1726883126.06804: _execute() done 28983 1726883126.06807: dumping result to json 28983 1726883126.06825: done dumping result, returning 28983 1726883126.06837: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-0000000024b0] 28983 1726883126.06851: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b0 28983 1726883126.07196: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b0 28983 1726883126.07198: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883126.07357: no more pending results, returning what we have 28983 1726883126.07361: results queue empty 28983 1726883126.07362: checking for any_errors_fatal 28983 1726883126.07375: done checking for any_errors_fatal 28983 1726883126.07376: checking for max_fail_percentage 28983 1726883126.07378: done checking for max_fail_percentage 28983 1726883126.07379: checking to see if all hosts have failed and the running result is not ok 28983 1726883126.07380: done checking to see if all hosts have failed 28983 1726883126.07380: getting the remaining hosts for this loop 28983 1726883126.07382: done getting the remaining hosts for this loop 28983 1726883126.07388: 
getting the next task for host managed_node2 28983 1726883126.07396: done getting next task for host managed_node2 28983 1726883126.07400: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883126.07407: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883126.07426: getting variables 28983 1726883126.07427: in VariableManager get_vars() 28983 1726883126.07538: Calling all_inventory to load vars for managed_node2 28983 1726883126.07542: Calling groups_inventory to load vars for managed_node2 28983 1726883126.07544: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883126.07570: Calling all_plugins_play to load vars for managed_node2 28983 1726883126.07575: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883126.07580: Calling groups_plugins_play to load vars for managed_node2 28983 1726883126.10259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883126.12091: done with get_vars() 28983 1726883126.12118: done getting variables 28983 1726883126.12172: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:45:26 -0400 (0:00:00.765) 0:02:36.119 ****** 28983 1726883126.12207: entering _queue_task() for managed_node2/service 28983 1726883126.12496: worker is 1 (out of 1 available) 28983 1726883126.12512: exiting _queue_task() for managed_node2/service 28983 1726883126.12527: done queuing things up, now waiting for results queue to drain 28983 1726883126.12529: waiting for pending results... 
28983 1726883126.13052: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883126.13058: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b1 28983 1726883126.13061: variable 'ansible_search_path' from source: unknown 28983 1726883126.13066: variable 'ansible_search_path' from source: unknown 28983 1726883126.13069: calling self._execute() 28983 1726883126.13242: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883126.13247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883126.13250: variable 'omit' from source: magic vars 28983 1726883126.13708: variable 'ansible_distribution_major_version' from source: facts 28983 1726883126.13730: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883126.13890: variable 'network_provider' from source: set_fact 28983 1726883126.13904: Evaluated conditional (network_provider == "nm"): True 28983 1726883126.14029: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883126.14150: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883126.14441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883126.18519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883126.18631: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883126.18686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883126.18737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883126.18779: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883126.18895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883126.18939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883126.19039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883126.19043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883126.19061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883126.19127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883126.19163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883126.19200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883126.19446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883126.19450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883126.19453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883126.19455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883126.19550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883126.19693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883126.19717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883126.19937: variable 'network_connections' from source: include params 28983 1726883126.19958: variable 'interface' from source: play vars 28983 1726883126.20092: variable 'interface' from source: play vars 28983 1726883126.20208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883126.20419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883126.20473: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883126.20518: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883126.20564: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883126.20649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883126.20685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883126.20721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883126.20759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883126.20819: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883126.21183: variable 'network_connections' from source: include params 28983 1726883126.21195: variable 'interface' from source: play vars 28983 1726883126.21445: variable 'interface' from source: play vars 28983 1726883126.21449: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883126.21451: when evaluation is False, skipping this task 28983 1726883126.21453: _execute() done 28983 1726883126.21456: dumping result to json 28983 1726883126.21458: done dumping result, returning 28983 1726883126.21461: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-0000000024b1] 28983 
1726883126.21475: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b1 28983 1726883126.21564: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b1 28983 1726883126.21568: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883126.21676: no more pending results, returning what we have 28983 1726883126.21700: results queue empty 28983 1726883126.21702: checking for any_errors_fatal 28983 1726883126.21765: done checking for any_errors_fatal 28983 1726883126.22037: checking for max_fail_percentage 28983 1726883126.22040: done checking for max_fail_percentage 28983 1726883126.22041: checking to see if all hosts have failed and the running result is not ok 28983 1726883126.22042: done checking to see if all hosts have failed 28983 1726883126.22043: getting the remaining hosts for this loop 28983 1726883126.22046: done getting the remaining hosts for this loop 28983 1726883126.22051: getting the next task for host managed_node2 28983 1726883126.22060: done getting next task for host managed_node2 28983 1726883126.22065: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883126.22080: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883126.22111: getting variables 28983 1726883126.22114: in VariableManager get_vars() 28983 1726883126.22167: Calling all_inventory to load vars for managed_node2 28983 1726883126.22173: Calling groups_inventory to load vars for managed_node2 28983 1726883126.22176: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883126.22187: Calling all_plugins_play to load vars for managed_node2 28983 1726883126.22191: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883126.22195: Calling groups_plugins_play to load vars for managed_node2 28983 1726883126.25939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883126.30559: done with get_vars() 28983 1726883126.30601: done getting variables 28983 1726883126.30675: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:45:26 -0400 (0:00:00.185) 0:02:36.304 
****** 28983 1726883126.30716: entering _queue_task() for managed_node2/service 28983 1726883126.31102: worker is 1 (out of 1 available) 28983 1726883126.31116: exiting _queue_task() for managed_node2/service 28983 1726883126.31131: done queuing things up, now waiting for results queue to drain 28983 1726883126.31135: waiting for pending results... 28983 1726883126.32052: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883126.32237: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b2 28983 1726883126.32259: variable 'ansible_search_path' from source: unknown 28983 1726883126.32268: variable 'ansible_search_path' from source: unknown 28983 1726883126.32318: calling self._execute() 28983 1726883126.32441: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883126.32455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883126.32474: variable 'omit' from source: magic vars 28983 1726883126.32927: variable 'ansible_distribution_major_version' from source: facts 28983 1726883126.32952: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883126.33169: variable 'network_provider' from source: set_fact 28983 1726883126.33172: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883126.33175: when evaluation is False, skipping this task 28983 1726883126.33177: _execute() done 28983 1726883126.33180: dumping result to json 28983 1726883126.33182: done dumping result, returning 28983 1726883126.33185: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-0000000024b2] 28983 1726883126.33187: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b2 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
28983 1726883126.33400: no more pending results, returning what we have 28983 1726883126.33404: results queue empty 28983 1726883126.33405: checking for any_errors_fatal 28983 1726883126.33418: done checking for any_errors_fatal 28983 1726883126.33419: checking for max_fail_percentage 28983 1726883126.33421: done checking for max_fail_percentage 28983 1726883126.33422: checking to see if all hosts have failed and the running result is not ok 28983 1726883126.33423: done checking to see if all hosts have failed 28983 1726883126.33424: getting the remaining hosts for this loop 28983 1726883126.33427: done getting the remaining hosts for this loop 28983 1726883126.33432: getting the next task for host managed_node2 28983 1726883126.33445: done getting next task for host managed_node2 28983 1726883126.33450: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883126.33459: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883126.33502: getting variables 28983 1726883126.33504: in VariableManager get_vars() 28983 1726883126.33762: Calling all_inventory to load vars for managed_node2 28983 1726883126.33766: Calling groups_inventory to load vars for managed_node2 28983 1726883126.33769: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883126.33779: Calling all_plugins_play to load vars for managed_node2 28983 1726883126.33782: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883126.33785: Calling groups_plugins_play to load vars for managed_node2 28983 1726883126.34364: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b2 28983 1726883126.34367: WORKER PROCESS EXITING 28983 1726883126.38569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883126.42758: done with get_vars() 28983 1726883126.42806: done getting variables 28983 1726883126.42887: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:45:26 -0400 (0:00:00.122) 0:02:36.427 ****** 28983 1726883126.42932: entering _queue_task() for managed_node2/copy 28983 1726883126.43458: worker is 1 (out of 1 available) 28983 1726883126.43475: exiting _queue_task() for managed_node2/copy 28983 1726883126.43488: done queuing things up, now waiting for results queue to drain 28983 1726883126.43490: waiting for pending 
results... 28983 1726883126.43747: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883126.44040: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b3 28983 1726883126.44044: variable 'ansible_search_path' from source: unknown 28983 1726883126.44047: variable 'ansible_search_path' from source: unknown 28983 1726883126.44050: calling self._execute() 28983 1726883126.44164: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883126.44184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883126.44239: variable 'omit' from source: magic vars 28983 1726883126.44963: variable 'ansible_distribution_major_version' from source: facts 28983 1726883126.44990: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883126.45160: variable 'network_provider' from source: set_fact 28983 1726883126.45182: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883126.45191: when evaluation is False, skipping this task 28983 1726883126.45203: _execute() done 28983 1726883126.45240: dumping result to json 28983 1726883126.45244: done dumping result, returning 28983 1726883126.45247: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-0000000024b3] 28983 1726883126.45249: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b3 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883126.45583: no more pending results, returning what we have 28983 1726883126.45589: results queue empty 28983 1726883126.45590: checking for any_errors_fatal 28983 1726883126.45714: done checking for any_errors_fatal 28983 1726883126.45716: checking for max_fail_percentage 
28983 1726883126.45719: done checking for max_fail_percentage 28983 1726883126.45720: checking to see if all hosts have failed and the running result is not ok 28983 1726883126.45721: done checking to see if all hosts have failed 28983 1726883126.45722: getting the remaining hosts for this loop 28983 1726883126.45724: done getting the remaining hosts for this loop 28983 1726883126.45730: getting the next task for host managed_node2 28983 1726883126.45828: done getting next task for host managed_node2 28983 1726883126.45836: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883126.45843: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883126.45877: getting variables 28983 1726883126.45879: in VariableManager get_vars() 28983 1726883126.46110: Calling all_inventory to load vars for managed_node2 28983 1726883126.46113: Calling groups_inventory to load vars for managed_node2 28983 1726883126.46116: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883126.46126: Calling all_plugins_play to load vars for managed_node2 28983 1726883126.46129: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883126.46133: Calling groups_plugins_play to load vars for managed_node2 28983 1726883126.46727: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b3 28983 1726883126.46731: WORKER PROCESS EXITING 28983 1726883126.49143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883126.52460: done with get_vars() 28983 1726883126.52502: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:45:26 -0400 (0:00:00.096) 0:02:36.523 ****** 28983 1726883126.52615: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883126.53015: worker is 1 (out of 1 available) 28983 1726883126.53031: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883126.53047: done queuing things up, now waiting for results queue to drain 28983 1726883126.53049: waiting for pending results... 
28983 1726883126.53320: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883126.53496: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b4 28983 1726883126.53512: variable 'ansible_search_path' from source: unknown 28983 1726883126.53517: variable 'ansible_search_path' from source: unknown 28983 1726883126.53560: calling self._execute() 28983 1726883126.53775: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883126.53779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883126.53783: variable 'omit' from source: magic vars 28983 1726883126.54147: variable 'ansible_distribution_major_version' from source: facts 28983 1726883126.54160: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883126.54167: variable 'omit' from source: magic vars 28983 1726883126.54255: variable 'omit' from source: magic vars 28983 1726883126.54456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883126.57277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883126.57383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883126.57432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883126.57495: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883126.57532: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883126.57839: variable 'network_provider' from source: set_fact 28983 1726883126.57843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883126.57847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883126.57891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883126.57952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883126.57986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883126.58090: variable 'omit' from source: magic vars 28983 1726883126.58238: variable 'omit' from source: magic vars 28983 1726883126.58382: variable 'network_connections' from source: include params 28983 1726883126.58409: variable 'interface' from source: play vars 28983 1726883126.58494: variable 'interface' from source: play vars 28983 1726883126.58697: variable 'omit' from source: magic vars 28983 1726883126.58712: variable '__lsr_ansible_managed' from source: task vars 28983 1726883126.58803: variable '__lsr_ansible_managed' from source: task vars 28983 1726883126.59064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883126.59380: Loaded config def from plugin (lookup/template) 28983 1726883126.59396: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883126.59431: File lookup term: get_ansible_managed.j2 28983 1726883126.59441: variable 
'ansible_search_path' from source: unknown 28983 1726883126.59452: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883126.59474: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883126.59599: variable 'ansible_search_path' from source: unknown 28983 1726883126.70514: variable 'ansible_managed' from source: unknown 28983 1726883126.70777: variable 'omit' from source: magic vars 28983 1726883126.70815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883126.70862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883126.70893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883126.70920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883126.70945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883126.70985: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883126.70994: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883126.71003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883126.71127: Set connection var ansible_connection to ssh 28983 1726883126.71154: Set connection var ansible_shell_executable to /bin/sh 28983 1726883126.71173: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883126.71238: Set connection var ansible_timeout to 10 28983 1726883126.71241: Set connection var ansible_pipelining to False 28983 1726883126.71244: Set connection var ansible_shell_type to sh 28983 1726883126.71246: variable 'ansible_shell_executable' from source: unknown 28983 1726883126.71248: variable 'ansible_connection' from source: unknown 28983 1726883126.71250: variable 'ansible_module_compression' from source: unknown 28983 1726883126.71252: variable 'ansible_shell_type' from source: unknown 28983 1726883126.71265: variable 'ansible_shell_executable' from source: unknown 28983 1726883126.71278: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883126.71290: variable 'ansible_pipelining' from source: unknown 28983 1726883126.71298: variable 'ansible_timeout' from source: unknown 28983 1726883126.71308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883126.71489: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883126.71591: variable 'omit' from 
source: magic vars 28983 1726883126.71594: starting attempt loop 28983 1726883126.71596: running the handler 28983 1726883126.71599: _low_level_execute_command(): starting 28983 1726883126.71601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883126.72405: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 28983 1726883126.72410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.72475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.72523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883126.72546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883126.72576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883126.72703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883126.74484: stdout chunk (state=3): >>>/root <<< 28983 1726883126.74712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883126.74715: stdout chunk (state=3): >>><<< 28983 1726883126.74718: stderr chunk 
(state=3): >>><<< 28983 1726883126.74740: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883126.74760: _low_level_execute_command(): starting 28983 1726883126.74782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767 `" && echo ansible-tmp-1726883126.7474747-34561-145707889072767="` echo /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767 `" ) && sleep 0' 28983 1726883126.75518: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.75521: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.75524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.75526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.75589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883126.75659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883126.75748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883126.77755: stdout chunk (state=3): >>>ansible-tmp-1726883126.7474747-34561-145707889072767=/root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767 <<< 28983 1726883126.77875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883126.77918: stderr chunk (state=3): >>><<< 28983 1726883126.77921: stdout chunk (state=3): >>><<< 28983 1726883126.77941: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883126.7474747-34561-145707889072767=/root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883126.77981: variable 'ansible_module_compression' from source: unknown 28983 1726883126.78024: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883126.78066: variable 'ansible_facts' from source: unknown 28983 1726883126.78164: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py 28983 1726883126.78281: Sending initial data 28983 1726883126.78285: Sent initial data (168 bytes) 28983 1726883126.78720: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.78739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match not found <<< 28983 1726883126.78743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.78831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.78847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883126.78874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883126.78987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883126.80619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883126.80684: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883126.80760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpl9xn7szr /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py <<< 28983 1726883126.80763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py" <<< 28983 1726883126.80825: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpl9xn7szr" to remote "/root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py" <<< 28983 1726883126.82063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883126.82113: stderr chunk (state=3): >>><<< 28983 1726883126.82116: stdout chunk (state=3): >>><<< 28983 1726883126.82137: done transferring module to remote 28983 1726883126.82148: _low_level_execute_command(): starting 28983 1726883126.82151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/ /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py && sleep 0' 28983 1726883126.82686: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883126.82745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883126.82833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883126.84667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883126.84710: stderr chunk (state=3): >>><<< 28983 1726883126.84714: stdout chunk (state=3): >>><<< 28983 1726883126.84727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883126.84730: _low_level_execute_command(): starting 28983 1726883126.84737: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/AnsiballZ_network_connections.py && sleep 0' 28983 1726883126.85159: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.85162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.85164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883126.85167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883126.85219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883126.85227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883126.85305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.21317: stdout chunk (state=3): >>>Traceback (most 
recent call last): <<< 28983 1726883127.21363: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_nj8pyz6f/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_nj8pyz6f/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/e447ed98-bcde-4ff6-b521-e956422e5e9a: error=unknown <<< 28983 1726883127.21554: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883127.23541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883127.23609: stderr chunk (state=3): >>><<< 28983 1726883127.23613: stdout chunk (state=3): >>><<< 28983 1726883127.23631: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_nj8pyz6f/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_nj8pyz6f/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/e447ed98-bcde-4ff6-b521-e956422e5e9a: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883127.23669: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883127.23683: _low_level_execute_command(): starting 28983 1726883127.23689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883126.7474747-34561-145707889072767/ > /dev/null 2>&1 && sleep 0' 
28983 1726883127.24191: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883127.24194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883127.24197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.24200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883127.24202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883127.24204: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.24258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883127.24263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883127.24337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.26328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883127.26377: stderr chunk (state=3): >>><<< 28983 1726883127.26380: stdout chunk (state=3): >>><<< 28983 1726883127.26394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883127.26402: handler run complete 28983 1726883127.26430: attempt loop complete, returning result 28983 1726883127.26435: _execute() done 28983 1726883127.26438: dumping result to json 28983 1726883127.26445: done dumping result, returning 28983 1726883127.26454: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-0000000024b4] 28983 1726883127.26461: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b4 28983 1726883127.26580: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b4 28983 1726883127.26585: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, 
"provider": "nm" } }, "changed": true } STDERR: 28983 1726883127.26716: no more pending results, returning what we have 28983 1726883127.26720: results queue empty 28983 1726883127.26721: checking for any_errors_fatal 28983 1726883127.26729: done checking for any_errors_fatal 28983 1726883127.26730: checking for max_fail_percentage 28983 1726883127.26732: done checking for max_fail_percentage 28983 1726883127.26733: checking to see if all hosts have failed and the running result is not ok 28983 1726883127.26744: done checking to see if all hosts have failed 28983 1726883127.26745: getting the remaining hosts for this loop 28983 1726883127.26747: done getting the remaining hosts for this loop 28983 1726883127.26752: getting the next task for host managed_node2 28983 1726883127.26760: done getting next task for host managed_node2 28983 1726883127.26764: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883127.26771: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883127.26786: getting variables 28983 1726883127.26787: in VariableManager get_vars() 28983 1726883127.26836: Calling all_inventory to load vars for managed_node2 28983 1726883127.26840: Calling groups_inventory to load vars for managed_node2 28983 1726883127.26843: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.26862: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.26865: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.26869: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.28205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.29835: done with get_vars() 28983 1726883127.29859: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:45:27 -0400 (0:00:00.773) 0:02:37.297 ****** 28983 1726883127.29940: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883127.30191: worker is 1 (out of 1 available) 28983 1726883127.30207: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883127.30221: done queuing things up, now waiting for results queue to drain 28983 1726883127.30223: waiting for pending results... 
28983 1726883127.30424: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883127.30553: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b5 28983 1726883127.30577: variable 'ansible_search_path' from source: unknown 28983 1726883127.30581: variable 'ansible_search_path' from source: unknown 28983 1726883127.30607: calling self._execute() 28983 1726883127.30696: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.30703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.30715: variable 'omit' from source: magic vars 28983 1726883127.31049: variable 'ansible_distribution_major_version' from source: facts 28983 1726883127.31060: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883127.31168: variable 'network_state' from source: role '' defaults 28983 1726883127.31180: Evaluated conditional (network_state != {}): False 28983 1726883127.31183: when evaluation is False, skipping this task 28983 1726883127.31186: _execute() done 28983 1726883127.31190: dumping result to json 28983 1726883127.31195: done dumping result, returning 28983 1726883127.31203: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-0000000024b5] 28983 1726883127.31209: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b5 28983 1726883127.31309: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b5 28983 1726883127.31313: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883127.31381: no more pending results, returning what we have 28983 1726883127.31385: results queue empty 28983 1726883127.31386: checking for any_errors_fatal 28983 1726883127.31396: done checking for any_errors_fatal 
28983 1726883127.31397: checking for max_fail_percentage 28983 1726883127.31399: done checking for max_fail_percentage 28983 1726883127.31400: checking to see if all hosts have failed and the running result is not ok 28983 1726883127.31401: done checking to see if all hosts have failed 28983 1726883127.31402: getting the remaining hosts for this loop 28983 1726883127.31404: done getting the remaining hosts for this loop 28983 1726883127.31409: getting the next task for host managed_node2 28983 1726883127.31417: done getting next task for host managed_node2 28983 1726883127.31421: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883127.31427: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883127.31454: getting variables 28983 1726883127.31455: in VariableManager get_vars() 28983 1726883127.31503: Calling all_inventory to load vars for managed_node2 28983 1726883127.31505: Calling groups_inventory to load vars for managed_node2 28983 1726883127.31507: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.31514: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.31516: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.31518: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.32867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.34475: done with get_vars() 28983 1726883127.34498: done getting variables 28983 1726883127.34551: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:45:27 -0400 (0:00:00.046) 0:02:37.343 ****** 28983 1726883127.34579: entering _queue_task() for managed_node2/debug 28983 1726883127.34794: worker is 1 (out of 1 available) 28983 1726883127.34809: exiting _queue_task() for managed_node2/debug 28983 1726883127.34823: done queuing things up, now waiting for results queue to drain 28983 1726883127.34825: waiting for pending results... 
28983 1726883127.35031: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883127.35146: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b6 28983 1726883127.35160: variable 'ansible_search_path' from source: unknown 28983 1726883127.35167: variable 'ansible_search_path' from source: unknown 28983 1726883127.35202: calling self._execute() 28983 1726883127.35290: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.35298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.35308: variable 'omit' from source: magic vars 28983 1726883127.35633: variable 'ansible_distribution_major_version' from source: facts 28983 1726883127.35645: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883127.35652: variable 'omit' from source: magic vars 28983 1726883127.35708: variable 'omit' from source: magic vars 28983 1726883127.35743: variable 'omit' from source: magic vars 28983 1726883127.35783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883127.35814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883127.35837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883127.35853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883127.35864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883127.35894: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883127.35899: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.35902: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883127.35989: Set connection var ansible_connection to ssh 28983 1726883127.36000: Set connection var ansible_shell_executable to /bin/sh 28983 1726883127.36008: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883127.36017: Set connection var ansible_timeout to 10 28983 1726883127.36025: Set connection var ansible_pipelining to False 28983 1726883127.36028: Set connection var ansible_shell_type to sh 28983 1726883127.36054: variable 'ansible_shell_executable' from source: unknown 28983 1726883127.36058: variable 'ansible_connection' from source: unknown 28983 1726883127.36061: variable 'ansible_module_compression' from source: unknown 28983 1726883127.36063: variable 'ansible_shell_type' from source: unknown 28983 1726883127.36065: variable 'ansible_shell_executable' from source: unknown 28983 1726883127.36068: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.36072: variable 'ansible_pipelining' from source: unknown 28983 1726883127.36078: variable 'ansible_timeout' from source: unknown 28983 1726883127.36084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.36207: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883127.36217: variable 'omit' from source: magic vars 28983 1726883127.36223: starting attempt loop 28983 1726883127.36226: running the handler 28983 1726883127.36337: variable '__network_connections_result' from source: set_fact 28983 1726883127.36388: handler run complete 28983 1726883127.36405: attempt loop complete, returning result 28983 1726883127.36408: _execute() done 28983 1726883127.36411: dumping result to json 28983 1726883127.36416: 
done dumping result, returning 28983 1726883127.36426: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000024b6] 28983 1726883127.36431: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b6 28983 1726883127.36522: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b6 28983 1726883127.36525: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 28983 1726883127.36604: no more pending results, returning what we have 28983 1726883127.36607: results queue empty 28983 1726883127.36608: checking for any_errors_fatal 28983 1726883127.36614: done checking for any_errors_fatal 28983 1726883127.36615: checking for max_fail_percentage 28983 1726883127.36617: done checking for max_fail_percentage 28983 1726883127.36618: checking to see if all hosts have failed and the running result is not ok 28983 1726883127.36619: done checking to see if all hosts have failed 28983 1726883127.36620: getting the remaining hosts for this loop 28983 1726883127.36622: done getting the remaining hosts for this loop 28983 1726883127.36626: getting the next task for host managed_node2 28983 1726883127.36636: done getting next task for host managed_node2 28983 1726883127.36640: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883127.36645: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883127.36658: getting variables 28983 1726883127.36659: in VariableManager get_vars() 28983 1726883127.36698: Calling all_inventory to load vars for managed_node2 28983 1726883127.36701: Calling groups_inventory to load vars for managed_node2 28983 1726883127.36703: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.36712: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.36720: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.36724: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.38053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.39676: done with get_vars() 28983 1726883127.39698: done getting variables 28983 1726883127.39744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:45:27 -0400 (0:00:00.051) 0:02:37.395 ****** 28983 1726883127.39779: entering _queue_task() for managed_node2/debug 28983 1726883127.39985: worker is 1 (out of 1 available) 28983 1726883127.39998: exiting _queue_task() for managed_node2/debug 28983 1726883127.40011: done queuing things up, now waiting for results queue to drain 28983 1726883127.40013: waiting for pending results... 28983 1726883127.40213: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883127.40326: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b7 28983 1726883127.40339: variable 'ansible_search_path' from source: unknown 28983 1726883127.40343: variable 'ansible_search_path' from source: unknown 28983 1726883127.40378: calling self._execute() 28983 1726883127.40467: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.40473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.40477: variable 'omit' from source: magic vars 28983 1726883127.40792: variable 'ansible_distribution_major_version' from source: facts 28983 1726883127.40803: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883127.40808: variable 'omit' from source: magic vars 28983 1726883127.40866: variable 'omit' from source: magic vars 28983 1726883127.40900: variable 'omit' from source: magic vars 28983 1726883127.40936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883127.40966: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883127.40986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883127.41004: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883127.41016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883127.41047: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883127.41050: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.41055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.41136: Set connection var ansible_connection to ssh 28983 1726883127.41149: Set connection var ansible_shell_executable to /bin/sh 28983 1726883127.41157: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883127.41166: Set connection var ansible_timeout to 10 28983 1726883127.41174: Set connection var ansible_pipelining to False 28983 1726883127.41177: Set connection var ansible_shell_type to sh 28983 1726883127.41194: variable 'ansible_shell_executable' from source: unknown 28983 1726883127.41197: variable 'ansible_connection' from source: unknown 28983 1726883127.41200: variable 'ansible_module_compression' from source: unknown 28983 1726883127.41205: variable 'ansible_shell_type' from source: unknown 28983 1726883127.41207: variable 'ansible_shell_executable' from source: unknown 28983 1726883127.41212: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.41217: variable 'ansible_pipelining' from source: unknown 28983 1726883127.41220: variable 'ansible_timeout' from source: unknown 28983 1726883127.41231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.41354: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883127.41369: variable 'omit' from source: magic vars 28983 1726883127.41377: starting attempt loop 28983 1726883127.41380: running the handler 28983 1726883127.41423: variable '__network_connections_result' from source: set_fact 28983 1726883127.41491: variable '__network_connections_result' from source: set_fact 28983 1726883127.41591: handler run complete 28983 1726883127.41612: attempt loop complete, returning result 28983 1726883127.41615: _execute() done 28983 1726883127.41618: dumping result to json 28983 1726883127.41624: done dumping result, returning 28983 1726883127.41632: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000024b7] 28983 1726883127.41640: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b7 28983 1726883127.41741: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b7 28983 1726883127.41744: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 28983 1726883127.41877: no more pending results, returning what we have 28983 1726883127.41882: results queue empty 28983 1726883127.41883: checking for any_errors_fatal 28983 1726883127.41889: done checking for any_errors_fatal 28983 1726883127.41890: checking for max_fail_percentage 28983 1726883127.41892: done checking for max_fail_percentage 28983 1726883127.41893: checking to see if 
all hosts have failed and the running result is not ok 28983 1726883127.41894: done checking to see if all hosts have failed 28983 1726883127.41895: getting the remaining hosts for this loop 28983 1726883127.41897: done getting the remaining hosts for this loop 28983 1726883127.41900: getting the next task for host managed_node2 28983 1726883127.41908: done getting next task for host managed_node2 28983 1726883127.41910: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883127.41914: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883127.41924: getting variables 28983 1726883127.41925: in VariableManager get_vars() 28983 1726883127.41956: Calling all_inventory to load vars for managed_node2 28983 1726883127.41958: Calling groups_inventory to load vars for managed_node2 28983 1726883127.41960: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.41966: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.41968: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.41975: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.43173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.44775: done with get_vars() 28983 1726883127.44797: done getting variables 28983 1726883127.44845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:45:27 -0400 (0:00:00.050) 0:02:37.446 ****** 28983 1726883127.44873: entering _queue_task() for managed_node2/debug 28983 1726883127.45076: worker is 1 (out of 1 available) 28983 1726883127.45091: exiting _queue_task() for managed_node2/debug 28983 1726883127.45103: done queuing things up, now waiting for results queue to drain 28983 1726883127.45105: waiting for pending results... 
28983 1726883127.45298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883127.45411: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b8 28983 1726883127.45423: variable 'ansible_search_path' from source: unknown 28983 1726883127.45427: variable 'ansible_search_path' from source: unknown 28983 1726883127.45463: calling self._execute() 28983 1726883127.45548: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.45552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.45568: variable 'omit' from source: magic vars 28983 1726883127.45874: variable 'ansible_distribution_major_version' from source: facts 28983 1726883127.45895: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883127.45995: variable 'network_state' from source: role '' defaults 28983 1726883127.46009: Evaluated conditional (network_state != {}): False 28983 1726883127.46012: when evaluation is False, skipping this task 28983 1726883127.46015: _execute() done 28983 1726883127.46018: dumping result to json 28983 1726883127.46022: done dumping result, returning 28983 1726883127.46029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-0000000024b8] 28983 1726883127.46036: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b8 28983 1726883127.46135: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b8 28983 1726883127.46139: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883127.46193: no more pending results, returning what we have 28983 1726883127.46197: results queue empty 28983 1726883127.46198: checking for any_errors_fatal 28983 1726883127.46205: done checking for any_errors_fatal 28983 1726883127.46206: checking for 
max_fail_percentage 28983 1726883127.46208: done checking for max_fail_percentage 28983 1726883127.46209: checking to see if all hosts have failed and the running result is not ok 28983 1726883127.46210: done checking to see if all hosts have failed 28983 1726883127.46211: getting the remaining hosts for this loop 28983 1726883127.46213: done getting the remaining hosts for this loop 28983 1726883127.46217: getting the next task for host managed_node2 28983 1726883127.46224: done getting next task for host managed_node2 28983 1726883127.46229: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883127.46236: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883127.46261: getting variables 28983 1726883127.46262: in VariableManager get_vars() 28983 1726883127.46301: Calling all_inventory to load vars for managed_node2 28983 1726883127.46304: Calling groups_inventory to load vars for managed_node2 28983 1726883127.46305: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.46311: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.46314: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.46316: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.47637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.49220: done with get_vars() 28983 1726883127.49247: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:45:27 -0400 (0:00:00.044) 0:02:37.490 ****** 28983 1726883127.49320: entering _queue_task() for managed_node2/ping 28983 1726883127.49538: worker is 1 (out of 1 available) 28983 1726883127.49551: exiting _queue_task() for managed_node2/ping 28983 1726883127.49565: done queuing things up, now waiting for results queue to drain 28983 1726883127.49567: waiting for pending results... 
28983 1726883127.49761: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883127.49880: in run() - task 0affe814-3a2d-b16d-c0a7-0000000024b9 28983 1726883127.49894: variable 'ansible_search_path' from source: unknown 28983 1726883127.49900: variable 'ansible_search_path' from source: unknown 28983 1726883127.49930: calling self._execute() 28983 1726883127.50020: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.50026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.50040: variable 'omit' from source: magic vars 28983 1726883127.50355: variable 'ansible_distribution_major_version' from source: facts 28983 1726883127.50367: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883127.50376: variable 'omit' from source: magic vars 28983 1726883127.50429: variable 'omit' from source: magic vars 28983 1726883127.50465: variable 'omit' from source: magic vars 28983 1726883127.50500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883127.50530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883127.50550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883127.50571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883127.50583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883127.50610: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883127.50613: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.50618: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883127.50703: Set connection var ansible_connection to ssh 28983 1726883127.50714: Set connection var ansible_shell_executable to /bin/sh 28983 1726883127.50722: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883127.50731: Set connection var ansible_timeout to 10 28983 1726883127.50739: Set connection var ansible_pipelining to False 28983 1726883127.50742: Set connection var ansible_shell_type to sh 28983 1726883127.50761: variable 'ansible_shell_executable' from source: unknown 28983 1726883127.50764: variable 'ansible_connection' from source: unknown 28983 1726883127.50768: variable 'ansible_module_compression' from source: unknown 28983 1726883127.50772: variable 'ansible_shell_type' from source: unknown 28983 1726883127.50779: variable 'ansible_shell_executable' from source: unknown 28983 1726883127.50783: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.50789: variable 'ansible_pipelining' from source: unknown 28983 1726883127.50793: variable 'ansible_timeout' from source: unknown 28983 1726883127.50802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.50973: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883127.50986: variable 'omit' from source: magic vars 28983 1726883127.50993: starting attempt loop 28983 1726883127.50996: running the handler 28983 1726883127.51013: _low_level_execute_command(): starting 28983 1726883127.51022: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883127.51566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 
1726883127.51572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.51576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.51638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883127.51641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883127.51644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883127.51725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.53498: stdout chunk (state=3): >>>/root <<< 28983 1726883127.53612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883127.53666: stderr chunk (state=3): >>><<< 28983 1726883127.53669: stdout chunk (state=3): >>><<< 28983 1726883127.53693: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883127.53707: _low_level_execute_command(): starting 28983 1726883127.53713: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146 `" && echo ansible-tmp-1726883127.5369437-34590-209035804763146="` echo /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146 `" ) && sleep 0' 28983 1726883127.54181: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883127.54186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883127.54189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883127.54198: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883127.54201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.54246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883127.54253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883127.54329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.56322: stdout chunk (state=3): >>>ansible-tmp-1726883127.5369437-34590-209035804763146=/root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146 <<< 28983 1726883127.56444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883127.56498: stderr chunk (state=3): >>><<< 28983 1726883127.56502: stdout chunk (state=3): >>><<< 28983 1726883127.56516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883127.5369437-34590-209035804763146=/root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883127.56557: variable 'ansible_module_compression' from source: unknown 28983 1726883127.56597: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883127.56626: variable 'ansible_facts' from source: unknown 28983 1726883127.56686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py 28983 1726883127.56794: Sending initial data 28983 1726883127.56798: Sent initial data (153 bytes) 28983 1726883127.57272: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883127.57276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.57279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883127.57282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.57332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883127.57341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883127.57409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.59038: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883127.59042: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883127.59102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883127.59169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2c7cqrlm /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py <<< 28983 1726883127.59174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py" <<< 28983 1726883127.59241: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp2c7cqrlm" to remote "/root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py" <<< 28983 1726883127.59244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py" <<< 28983 1726883127.60115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883127.60178: stderr chunk (state=3): >>><<< 28983 1726883127.60185: stdout chunk (state=3): >>><<< 28983 1726883127.60205: done transferring module to remote 28983 1726883127.60216: _low_level_execute_command(): starting 28983 1726883127.60222: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/ /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py && sleep 0' 28983 1726883127.60678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883127.60682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883127.60684: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883127.60687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.60752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883127.60755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883127.60823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.62693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883127.62739: stderr chunk (state=3): >>><<< 28983 1726883127.62743: stdout chunk (state=3): >>><<< 28983 1726883127.62759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883127.62762: _low_level_execute_command(): starting 28983 1726883127.62769: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/AnsiballZ_ping.py && sleep 0' 28983 1726883127.63219: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883127.63223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.63225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883127.63227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.63288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883127.63291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726883127.63366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.80224: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883127.81659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883127.81724: stderr chunk (state=3): >>><<< 28983 1726883127.81728: stdout chunk (state=3): >>><<< 28983 1726883127.81749: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883127.81779: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883127.81792: _low_level_execute_command(): starting 28983 1726883127.81796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883127.5369437-34590-209035804763146/ > /dev/null 2>&1 && sleep 0' 28983 1726883127.82296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883127.82300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883127.82302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.82304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883127.82307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883127.82361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883127.82365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883127.82444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883127.84355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883127.84406: stderr chunk (state=3): >>><<< 28983 1726883127.84409: stdout chunk (state=3): >>><<< 28983 1726883127.84425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883127.84436: handler run complete 28983 
1726883127.84450: attempt loop complete, returning result 28983 1726883127.84453: _execute() done 28983 1726883127.84458: dumping result to json 28983 1726883127.84462: done dumping result, returning 28983 1726883127.84474: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-0000000024b9] 28983 1726883127.84479: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b9 28983 1726883127.84580: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000024b9 28983 1726883127.84583: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883127.84678: no more pending results, returning what we have 28983 1726883127.84682: results queue empty 28983 1726883127.84683: checking for any_errors_fatal 28983 1726883127.84694: done checking for any_errors_fatal 28983 1726883127.84695: checking for max_fail_percentage 28983 1726883127.84696: done checking for max_fail_percentage 28983 1726883127.84698: checking to see if all hosts have failed and the running result is not ok 28983 1726883127.84699: done checking to see if all hosts have failed 28983 1726883127.84699: getting the remaining hosts for this loop 28983 1726883127.84702: done getting the remaining hosts for this loop 28983 1726883127.84707: getting the next task for host managed_node2 28983 1726883127.84719: done getting next task for host managed_node2 28983 1726883127.84722: ^ task is: TASK: meta (role_complete) 28983 1726883127.84728: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883127.84745: getting variables 28983 1726883127.84747: in VariableManager get_vars() 28983 1726883127.84800: Calling all_inventory to load vars for managed_node2 28983 1726883127.84803: Calling groups_inventory to load vars for managed_node2 28983 1726883127.84806: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.84815: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.84818: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.84822: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.91351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.93263: done with get_vars() 28983 1726883127.93294: done getting variables 28983 1726883127.93356: done queuing things up, now waiting for results queue to drain 28983 1726883127.93358: results queue empty 28983 1726883127.93358: checking for any_errors_fatal 28983 1726883127.93361: done checking for any_errors_fatal 28983 1726883127.93362: checking for max_fail_percentage 28983 1726883127.93362: done checking for max_fail_percentage 28983 1726883127.93363: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883127.93364: done checking to see if all hosts have failed 28983 1726883127.93365: getting the remaining hosts for this loop 28983 1726883127.93366: done getting the remaining hosts for this loop 28983 1726883127.93369: getting the next task for host managed_node2 28983 1726883127.93374: done getting next task for host managed_node2 28983 1726883127.93376: ^ task is: TASK: Test 28983 1726883127.93377: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883127.93379: getting variables 28983 1726883127.93380: in VariableManager get_vars() 28983 1726883127.93392: Calling all_inventory to load vars for managed_node2 28983 1726883127.93394: Calling groups_inventory to load vars for managed_node2 28983 1726883127.93396: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883127.93400: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.93402: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.93404: Calling groups_plugins_play to load vars for managed_node2 28983 1726883127.94868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883127.96909: done with get_vars() 28983 1726883127.96931: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:45:27 -0400 (0:00:00.476) 0:02:37.967 ****** 28983 1726883127.96995: entering _queue_task() for managed_node2/include_tasks 28983 1726883127.97288: worker is 1 (out of 1 available) 28983 1726883127.97304: exiting _queue_task() for managed_node2/include_tasks 28983 1726883127.97319: done queuing things up, now waiting for results queue to drain 28983 1726883127.97322: waiting for pending results... 
28983 1726883127.97539: running TaskExecutor() for managed_node2/TASK: Test 28983 1726883127.97640: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020b1 28983 1726883127.97656: variable 'ansible_search_path' from source: unknown 28983 1726883127.97659: variable 'ansible_search_path' from source: unknown 28983 1726883127.97707: variable 'lsr_test' from source: include params 28983 1726883127.97904: variable 'lsr_test' from source: include params 28983 1726883127.97966: variable 'omit' from source: magic vars 28983 1726883127.98094: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883127.98108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883127.98119: variable 'omit' from source: magic vars 28983 1726883127.98347: variable 'ansible_distribution_major_version' from source: facts 28983 1726883127.98357: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883127.98363: variable 'item' from source: unknown 28983 1726883127.98425: variable 'item' from source: unknown 28983 1726883127.98460: variable 'item' from source: unknown 28983 1726883127.98516: variable 'item' from source: unknown 28983 1726883127.98678: dumping result to json 28983 1726883127.98681: done dumping result, returning 28983 1726883127.98683: done running TaskExecutor() for managed_node2/TASK: Test [0affe814-3a2d-b16d-c0a7-0000000020b1] 28983 1726883127.98686: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b1 28983 1726883127.98725: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b1 28983 1726883127.98728: WORKER PROCESS EXITING 28983 1726883127.98765: no more pending results, returning what we have 28983 1726883127.98770: in VariableManager get_vars() 28983 1726883127.98820: Calling all_inventory to load vars for managed_node2 28983 1726883127.98823: Calling groups_inventory to load vars for managed_node2 28983 1726883127.98827: Calling all_plugins_inventory to load 
vars for managed_node2 28983 1726883127.98854: Calling all_plugins_play to load vars for managed_node2 28983 1726883127.98859: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883127.98866: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.01308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.04400: done with get_vars() 28983 1726883128.04437: variable 'ansible_search_path' from source: unknown 28983 1726883128.04439: variable 'ansible_search_path' from source: unknown 28983 1726883128.04495: we have included files to process 28983 1726883128.04497: generating all_blocks data 28983 1726883128.04499: done generating all_blocks data 28983 1726883128.04506: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883128.04508: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883128.04511: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 28983 1726883128.04694: done processing included file 28983 1726883128.04697: iterating over new_blocks loaded from include file 28983 1726883128.04699: in VariableManager get_vars() 28983 1726883128.04723: done with get_vars() 28983 1726883128.04725: filtering new block on tags 28983 1726883128.04762: done filtering new block on tags 28983 1726883128.04765: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 28983 1726883128.04771: extending task lists for all hosts with included blocks 28983 1726883128.06256: done extending task 
lists 28983 1726883128.06258: done processing included files 28983 1726883128.06259: results queue empty 28983 1726883128.06260: checking for any_errors_fatal 28983 1726883128.06262: done checking for any_errors_fatal 28983 1726883128.06263: checking for max_fail_percentage 28983 1726883128.06264: done checking for max_fail_percentage 28983 1726883128.06265: checking to see if all hosts have failed and the running result is not ok 28983 1726883128.06266: done checking to see if all hosts have failed 28983 1726883128.06267: getting the remaining hosts for this loop 28983 1726883128.06269: done getting the remaining hosts for this loop 28983 1726883128.06272: getting the next task for host managed_node2 28983 1726883128.06276: done getting next task for host managed_node2 28983 1726883128.06278: ^ task is: TASK: Include network role 28983 1726883128.06282: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883128.06285: getting variables 28983 1726883128.06286: in VariableManager get_vars() 28983 1726883128.06300: Calling all_inventory to load vars for managed_node2 28983 1726883128.06302: Calling groups_inventory to load vars for managed_node2 28983 1726883128.06305: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.06311: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.06314: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.06326: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.08551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.11581: done with get_vars() 28983 1726883128.11615: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:45:28 -0400 (0:00:00.147) 0:02:38.115 ****** 28983 1726883128.11740: entering _queue_task() for managed_node2/include_role 28983 1726883128.12175: worker is 1 (out of 1 available) 28983 1726883128.12188: exiting _queue_task() for managed_node2/include_role 28983 1726883128.12201: done queuing things up, now waiting for results queue to drain 28983 1726883128.12202: waiting for pending results... 
28983 1726883128.12671: running TaskExecutor() for managed_node2/TASK: Include network role 28983 1726883128.12677: in run() - task 0affe814-3a2d-b16d-c0a7-000000002612 28983 1726883128.12694: variable 'ansible_search_path' from source: unknown 28983 1726883128.12703: variable 'ansible_search_path' from source: unknown 28983 1726883128.12752: calling self._execute() 28983 1726883128.12889: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.12905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.12925: variable 'omit' from source: magic vars 28983 1726883128.13437: variable 'ansible_distribution_major_version' from source: facts 28983 1726883128.13458: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883128.13469: _execute() done 28983 1726883128.13478: dumping result to json 28983 1726883128.13487: done dumping result, returning 28983 1726883128.13499: done running TaskExecutor() for managed_node2/TASK: Include network role [0affe814-3a2d-b16d-c0a7-000000002612] 28983 1726883128.13511: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002612 28983 1726883128.13724: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002612 28983 1726883128.13729: WORKER PROCESS EXITING 28983 1726883128.13782: no more pending results, returning what we have 28983 1726883128.13789: in VariableManager get_vars() 28983 1726883128.13858: Calling all_inventory to load vars for managed_node2 28983 1726883128.13862: Calling groups_inventory to load vars for managed_node2 28983 1726883128.13867: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.13883: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.13888: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.13892: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.16022: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.17648: done with get_vars() 28983 1726883128.17667: variable 'ansible_search_path' from source: unknown 28983 1726883128.17668: variable 'ansible_search_path' from source: unknown 28983 1726883128.17783: variable 'omit' from source: magic vars 28983 1726883128.17816: variable 'omit' from source: magic vars 28983 1726883128.17828: variable 'omit' from source: magic vars 28983 1726883128.17831: we have included files to process 28983 1726883128.17831: generating all_blocks data 28983 1726883128.17833: done generating all_blocks data 28983 1726883128.17836: processing included file: fedora.linux_system_roles.network 28983 1726883128.17854: in VariableManager get_vars() 28983 1726883128.17867: done with get_vars() 28983 1726883128.17890: in VariableManager get_vars() 28983 1726883128.17907: done with get_vars() 28983 1726883128.17940: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 28983 1726883128.18061: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 28983 1726883128.18149: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 28983 1726883128.18863: in VariableManager get_vars() 28983 1726883128.18892: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883128.20680: iterating over new_blocks loaded from include file 28983 1726883128.20682: in VariableManager get_vars() 28983 1726883128.20696: done with get_vars() 28983 1726883128.20697: filtering new block on tags 28983 1726883128.20932: done filtering new block on tags 28983 1726883128.20937: in VariableManager get_vars() 28983 1726883128.20953: done with get_vars() 28983 1726883128.20955: filtering new block on tags 28983 1726883128.20970: done 
filtering new block on tags 28983 1726883128.20972: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 28983 1726883128.20976: extending task lists for all hosts with included blocks 28983 1726883128.21062: done extending task lists 28983 1726883128.21063: done processing included files 28983 1726883128.21064: results queue empty 28983 1726883128.21065: checking for any_errors_fatal 28983 1726883128.21068: done checking for any_errors_fatal 28983 1726883128.21069: checking for max_fail_percentage 28983 1726883128.21070: done checking for max_fail_percentage 28983 1726883128.21071: checking to see if all hosts have failed and the running result is not ok 28983 1726883128.21072: done checking to see if all hosts have failed 28983 1726883128.21072: getting the remaining hosts for this loop 28983 1726883128.21073: done getting the remaining hosts for this loop 28983 1726883128.21076: getting the next task for host managed_node2 28983 1726883128.21079: done getting next task for host managed_node2 28983 1726883128.21081: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883128.21084: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883128.21093: getting variables 28983 1726883128.21094: in VariableManager get_vars() 28983 1726883128.21104: Calling all_inventory to load vars for managed_node2 28983 1726883128.21106: Calling groups_inventory to load vars for managed_node2 28983 1726883128.21108: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.21112: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.21114: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.21116: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.22204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.24966: done with get_vars() 28983 1726883128.24999: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:45:28 -0400 (0:00:00.133) 0:02:38.248 ****** 28983 1726883128.25086: entering _queue_task() for managed_node2/include_tasks 28983 1726883128.25458: worker is 1 (out of 1 available) 28983 1726883128.25472: exiting _queue_task() for managed_node2/include_tasks 28983 1726883128.25486: done queuing things up, now waiting for results queue to drain 28983 1726883128.25488: waiting for pending results... 
28983 1726883128.25956: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 28983 1726883128.25962: in run() - task 0affe814-3a2d-b16d-c0a7-000000002694 28983 1726883128.25966: variable 'ansible_search_path' from source: unknown 28983 1726883128.25969: variable 'ansible_search_path' from source: unknown 28983 1726883128.26022: calling self._execute() 28983 1726883128.26151: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.26158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.26176: variable 'omit' from source: magic vars 28983 1726883128.26703: variable 'ansible_distribution_major_version' from source: facts 28983 1726883128.26717: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883128.26823: _execute() done 28983 1726883128.26826: dumping result to json 28983 1726883128.26828: done dumping result, returning 28983 1726883128.26830: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affe814-3a2d-b16d-c0a7-000000002694] 28983 1726883128.26832: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002694 28983 1726883128.26905: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002694 28983 1726883128.26908: WORKER PROCESS EXITING 28983 1726883128.26987: no more pending results, returning what we have 28983 1726883128.26992: in VariableManager get_vars() 28983 1726883128.27043: Calling all_inventory to load vars for managed_node2 28983 1726883128.27047: Calling groups_inventory to load vars for managed_node2 28983 1726883128.27050: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.27059: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.27062: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.27065: Calling 
groups_plugins_play to load vars for managed_node2 28983 1726883128.29618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.33005: done with get_vars() 28983 1726883128.33039: variable 'ansible_search_path' from source: unknown 28983 1726883128.33041: variable 'ansible_search_path' from source: unknown 28983 1726883128.33104: we have included files to process 28983 1726883128.33105: generating all_blocks data 28983 1726883128.33107: done generating all_blocks data 28983 1726883128.33111: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883128.33112: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883128.33115: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 28983 1726883128.33943: done processing included file 28983 1726883128.33945: iterating over new_blocks loaded from include file 28983 1726883128.33947: in VariableManager get_vars() 28983 1726883128.33996: done with get_vars() 28983 1726883128.33998: filtering new block on tags 28983 1726883128.34043: done filtering new block on tags 28983 1726883128.34047: in VariableManager get_vars() 28983 1726883128.34095: done with get_vars() 28983 1726883128.34098: filtering new block on tags 28983 1726883128.34166: done filtering new block on tags 28983 1726883128.34169: in VariableManager get_vars() 28983 1726883128.34217: done with get_vars() 28983 1726883128.34219: filtering new block on tags 28983 1726883128.34299: done filtering new block on tags 28983 1726883128.34302: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 28983 1726883128.34309: extending task lists for 
all hosts with included blocks 28983 1726883128.37503: done extending task lists 28983 1726883128.37505: done processing included files 28983 1726883128.37506: results queue empty 28983 1726883128.37507: checking for any_errors_fatal 28983 1726883128.37510: done checking for any_errors_fatal 28983 1726883128.37511: checking for max_fail_percentage 28983 1726883128.37513: done checking for max_fail_percentage 28983 1726883128.37514: checking to see if all hosts have failed and the running result is not ok 28983 1726883128.37515: done checking to see if all hosts have failed 28983 1726883128.37516: getting the remaining hosts for this loop 28983 1726883128.37517: done getting the remaining hosts for this loop 28983 1726883128.37521: getting the next task for host managed_node2 28983 1726883128.37527: done getting next task for host managed_node2 28983 1726883128.37530: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883128.37537: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883128.37563: getting variables 28983 1726883128.37565: in VariableManager get_vars() 28983 1726883128.37587: Calling all_inventory to load vars for managed_node2 28983 1726883128.37590: Calling groups_inventory to load vars for managed_node2 28983 1726883128.37593: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.37600: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.37603: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.37607: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.40882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.45519: done with get_vars() 28983 1726883128.45569: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:45:28 -0400 (0:00:00.205) 0:02:38.454 ****** 28983 1726883128.45683: entering _queue_task() for managed_node2/setup 28983 1726883128.46162: worker is 1 (out of 1 available) 28983 1726883128.46177: exiting _queue_task() for managed_node2/setup 28983 1726883128.46303: done queuing things up, now waiting for results queue to drain 28983 1726883128.46306: waiting for pending results... 
28983 1726883128.46511: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 28983 1726883128.46842: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026eb 28983 1726883128.46866: variable 'ansible_search_path' from source: unknown 28983 1726883128.46882: variable 'ansible_search_path' from source: unknown 28983 1726883128.46922: calling self._execute() 28983 1726883128.47037: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.47095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.47108: variable 'omit' from source: magic vars 28983 1726883128.47744: variable 'ansible_distribution_major_version' from source: facts 28983 1726883128.47749: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883128.48000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883128.50338: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883128.50401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883128.50432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883128.50467: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883128.50494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883128.50565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883128.50593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883128.50615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883128.50649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883128.50662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883128.50713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883128.50733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883128.50756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883128.50790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883128.50806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883128.50947: variable '__network_required_facts' from source: role 
'' defaults 28983 1726883128.50956: variable 'ansible_facts' from source: unknown 28983 1726883128.52342: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 28983 1726883128.52346: when evaluation is False, skipping this task 28983 1726883128.52349: _execute() done 28983 1726883128.52351: dumping result to json 28983 1726883128.52354: done dumping result, returning 28983 1726883128.52357: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affe814-3a2d-b16d-c0a7-0000000026eb] 28983 1726883128.52359: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026eb 28983 1726883128.52516: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026eb 28983 1726883128.52519: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883128.52589: no more pending results, returning what we have 28983 1726883128.52594: results queue empty 28983 1726883128.52595: checking for any_errors_fatal 28983 1726883128.52597: done checking for any_errors_fatal 28983 1726883128.52598: checking for max_fail_percentage 28983 1726883128.52601: done checking for max_fail_percentage 28983 1726883128.52602: checking to see if all hosts have failed and the running result is not ok 28983 1726883128.52603: done checking to see if all hosts have failed 28983 1726883128.52604: getting the remaining hosts for this loop 28983 1726883128.52606: done getting the remaining hosts for this loop 28983 1726883128.52612: getting the next task for host managed_node2 28983 1726883128.52625: done getting next task for host managed_node2 28983 1726883128.52630: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883128.52639: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883128.52681: getting variables 28983 1726883128.52684: in VariableManager get_vars() 28983 1726883128.52943: Calling all_inventory to load vars for managed_node2 28983 1726883128.52947: Calling groups_inventory to load vars for managed_node2 28983 1726883128.52951: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.52970: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.52975: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.52985: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.55335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.58346: done with get_vars() 28983 1726883128.58393: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:45:28 -0400 (0:00:00.128) 0:02:38.583 ****** 28983 1726883128.58582: entering _queue_task() for managed_node2/stat 28983 1726883128.58962: worker is 1 (out of 1 available) 28983 1726883128.58975: exiting _queue_task() for managed_node2/stat 28983 1726883128.58990: done queuing things up, now waiting for results queue to drain 28983 1726883128.58992: waiting for pending results... 
28983 1726883128.59209: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 28983 1726883128.59350: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026ed 28983 1726883128.59366: variable 'ansible_search_path' from source: unknown 28983 1726883128.59373: variable 'ansible_search_path' from source: unknown 28983 1726883128.59404: calling self._execute() 28983 1726883128.59494: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.59499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.59508: variable 'omit' from source: magic vars 28983 1726883128.59840: variable 'ansible_distribution_major_version' from source: facts 28983 1726883128.59851: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883128.59997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883128.60228: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883128.60274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883128.60304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883128.60338: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883128.60409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883128.60430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883128.60454: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883128.60482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883128.60567: variable '__network_is_ostree' from source: set_fact 28983 1726883128.60579: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883128.60584: when evaluation is False, skipping this task 28983 1726883128.60587: _execute() done 28983 1726883128.60590: dumping result to json 28983 1726883128.60592: done dumping result, returning 28983 1726883128.60595: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affe814-3a2d-b16d-c0a7-0000000026ed] 28983 1726883128.60603: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026ed 28983 1726883128.60698: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026ed 28983 1726883128.60701: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883128.60761: no more pending results, returning what we have 28983 1726883128.60765: results queue empty 28983 1726883128.60766: checking for any_errors_fatal 28983 1726883128.60776: done checking for any_errors_fatal 28983 1726883128.60777: checking for max_fail_percentage 28983 1726883128.60779: done checking for max_fail_percentage 28983 1726883128.60780: checking to see if all hosts have failed and the running result is not ok 28983 1726883128.60781: done checking to see if all hosts have failed 28983 1726883128.60782: getting the remaining hosts for this loop 28983 1726883128.60784: done getting the remaining hosts for this loop 28983 
1726883128.60789: getting the next task for host managed_node2 28983 1726883128.60798: done getting next task for host managed_node2 28983 1726883128.60802: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883128.60809: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883128.60846: getting variables 28983 1726883128.60848: in VariableManager get_vars() 28983 1726883128.60894: Calling all_inventory to load vars for managed_node2 28983 1726883128.60897: Calling groups_inventory to load vars for managed_node2 28983 1726883128.60899: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.60908: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.60911: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.60915: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.62617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.64280: done with get_vars() 28983 1726883128.64303: done getting variables 28983 1726883128.64352: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:45:28 -0400 (0:00:00.058) 0:02:38.641 ****** 28983 1726883128.64390: entering _queue_task() for managed_node2/set_fact 28983 1726883128.64649: worker is 1 (out of 1 available) 28983 1726883128.64664: exiting _queue_task() for managed_node2/set_fact 28983 1726883128.64680: done queuing things up, now waiting for results queue to drain 28983 1726883128.64683: waiting for pending results... 
28983 1726883128.64887: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 28983 1726883128.65016: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026ee 28983 1726883128.65032: variable 'ansible_search_path' from source: unknown 28983 1726883128.65039: variable 'ansible_search_path' from source: unknown 28983 1726883128.65069: calling self._execute() 28983 1726883128.65156: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.65163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.65176: variable 'omit' from source: magic vars 28983 1726883128.65510: variable 'ansible_distribution_major_version' from source: facts 28983 1726883128.65521: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883128.65668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883128.65904: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883128.65944: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883128.65977: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883128.66006: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883128.66090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883128.66111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883128.66147: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883128.66170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883128.66256: variable '__network_is_ostree' from source: set_fact 28983 1726883128.66263: Evaluated conditional (not __network_is_ostree is defined): False 28983 1726883128.66266: when evaluation is False, skipping this task 28983 1726883128.66268: _execute() done 28983 1726883128.66276: dumping result to json 28983 1726883128.66281: done dumping result, returning 28983 1726883128.66289: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affe814-3a2d-b16d-c0a7-0000000026ee] 28983 1726883128.66295: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026ee 28983 1726883128.66388: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026ee 28983 1726883128.66392: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 28983 1726883128.66451: no more pending results, returning what we have 28983 1726883128.66455: results queue empty 28983 1726883128.66456: checking for any_errors_fatal 28983 1726883128.66463: done checking for any_errors_fatal 28983 1726883128.66463: checking for max_fail_percentage 28983 1726883128.66465: done checking for max_fail_percentage 28983 1726883128.66466: checking to see if all hosts have failed and the running result is not ok 28983 1726883128.66468: done checking to see if all hosts have failed 28983 1726883128.66469: getting the remaining hosts for this loop 28983 1726883128.66471: done getting the remaining hosts for this loop 
28983 1726883128.66476: getting the next task for host managed_node2 28983 1726883128.66490: done getting next task for host managed_node2 28983 1726883128.66494: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883128.66502: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883128.66530: getting variables 28983 1726883128.66532: in VariableManager get_vars() 28983 1726883128.66589: Calling all_inventory to load vars for managed_node2 28983 1726883128.66592: Calling groups_inventory to load vars for managed_node2 28983 1726883128.66595: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883128.66603: Calling all_plugins_play to load vars for managed_node2 28983 1726883128.66606: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883128.66610: Calling groups_plugins_play to load vars for managed_node2 28983 1726883128.68061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883128.69676: done with get_vars() 28983 1726883128.69703: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:45:28 -0400 (0:00:00.053) 0:02:38.695 ****** 28983 1726883128.69781: entering _queue_task() for managed_node2/service_facts 28983 1726883128.70012: worker is 1 (out of 1 available) 28983 1726883128.70026: exiting _queue_task() for managed_node2/service_facts 28983 1726883128.70042: done queuing things up, now waiting for results queue to drain 28983 1726883128.70044: waiting for pending results... 
28983 1726883128.70249: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 28983 1726883128.70385: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026f0 28983 1726883128.70396: variable 'ansible_search_path' from source: unknown 28983 1726883128.70399: variable 'ansible_search_path' from source: unknown 28983 1726883128.70428: calling self._execute() 28983 1726883128.70522: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.70530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.70543: variable 'omit' from source: magic vars 28983 1726883128.70877: variable 'ansible_distribution_major_version' from source: facts 28983 1726883128.70888: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883128.70894: variable 'omit' from source: magic vars 28983 1726883128.70966: variable 'omit' from source: magic vars 28983 1726883128.70995: variable 'omit' from source: magic vars 28983 1726883128.71031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883128.71066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883128.71085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883128.71101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883128.71110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883128.71139: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883128.71143: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.71153: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883128.71231: Set connection var ansible_connection to ssh 28983 1726883128.71243: Set connection var ansible_shell_executable to /bin/sh 28983 1726883128.71251: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883128.71263: Set connection var ansible_timeout to 10 28983 1726883128.71273: Set connection var ansible_pipelining to False 28983 1726883128.71276: Set connection var ansible_shell_type to sh 28983 1726883128.71297: variable 'ansible_shell_executable' from source: unknown 28983 1726883128.71300: variable 'ansible_connection' from source: unknown 28983 1726883128.71304: variable 'ansible_module_compression' from source: unknown 28983 1726883128.71306: variable 'ansible_shell_type' from source: unknown 28983 1726883128.71309: variable 'ansible_shell_executable' from source: unknown 28983 1726883128.71314: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883128.71319: variable 'ansible_pipelining' from source: unknown 28983 1726883128.71321: variable 'ansible_timeout' from source: unknown 28983 1726883128.71327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883128.71496: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883128.71507: variable 'omit' from source: magic vars 28983 1726883128.71514: starting attempt loop 28983 1726883128.71516: running the handler 28983 1726883128.71530: _low_level_execute_command(): starting 28983 1726883128.71539: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883128.72095: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883128.72099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.72103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883128.72105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.72161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883128.72164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883128.72247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883128.74027: stdout chunk (state=3): >>>/root <<< 28983 1726883128.74139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883128.74190: stderr chunk (state=3): >>><<< 28983 1726883128.74195: stdout chunk (state=3): >>><<< 28983 1726883128.74218: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883128.74230: _low_level_execute_command(): starting 28983 1726883128.74238: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428 `" && echo ansible-tmp-1726883128.7421777-34624-232451879358428="` echo /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428 `" ) && sleep 0' 28983 1726883128.74695: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883128.74699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 
1726883128.74708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883128.74710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.74756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883128.74765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883128.74832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883128.76855: stdout chunk (state=3): >>>ansible-tmp-1726883128.7421777-34624-232451879358428=/root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428 <<< 28983 1726883128.76972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883128.77011: stderr chunk (state=3): >>><<< 28983 1726883128.77014: stdout chunk (state=3): >>><<< 28983 1726883128.77032: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883128.7421777-34624-232451879358428=/root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883128.77068: variable 'ansible_module_compression' from source: unknown 28983 1726883128.77104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 28983 1726883128.77138: variable 'ansible_facts' from source: unknown 28983 1726883128.77204: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py 28983 1726883128.77315: Sending initial data 28983 1726883128.77319: Sent initial data (162 bytes) 28983 1726883128.77769: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883128.77773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883128.77776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883128.77778: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883128.77780: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.77840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883128.77843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883128.77910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883128.79583: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883128.79588: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883128.79652: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883128.79723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp753kxwlk /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py <<< 28983 1726883128.79729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py" <<< 28983 1726883128.79790: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp753kxwlk" to remote "/root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py" <<< 28983 1726883128.80727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883128.80785: stderr chunk (state=3): >>><<< 28983 1726883128.80788: stdout chunk (state=3): >>><<< 28983 1726883128.80805: done transferring module to remote 28983 1726883128.80817: _low_level_execute_command(): starting 28983 1726883128.80820: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/ /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py && sleep 0' 28983 1726883128.81257: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883128.81261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883128.81263: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883128.81265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883128.81269: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.81317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883128.81323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883128.81393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883128.83321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883128.83364: stderr chunk (state=3): >>><<< 28983 1726883128.83368: stdout chunk (state=3): >>><<< 28983 1726883128.83384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883128.83387: _low_level_execute_command(): starting 28983 1726883128.83392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/AnsiballZ_service_facts.py && sleep 0' 28983 1726883128.83791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883128.83819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883128.83823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.83825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883128.83827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883128.83889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883128.83893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883128.83966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883130.88068: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state":<<< 28983 1726883130.88092: stdout chunk (state=3): >>> "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "sourc<<< 28983 1726883130.88105: stdout chunk (state=3): >>>e": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": 
"active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "stat<<< 28983 1726883130.88124: stdout chunk (state=3): >>>ic", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", 
"source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "stati<<< 28983 1726883130.88158: stdout chunk (state=3): >>>c", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", 
"source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "sys<<< 28983 1726883130.88167: stdout chunk (state=3): >>>temd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 28983 1726883130.89782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883130.89846: stderr chunk (state=3): >>><<< 28983 1726883130.89851: stdout chunk (state=3): >>><<< 28983 1726883130.89887: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"network": {"name": "network", "state": "running", "status": "enabled", "source": "sysv"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "import-state.service": {"name": "import-state.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ip6tables.service": {"name": "ip6tables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "iptables.service": {"name": "iptables.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "generated", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": 
"dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "loadmodules.service": {"name": "loadmodules.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883130.90599: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883130.90608: _low_level_execute_command(): starting 28983 1726883130.90615: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883128.7421777-34624-232451879358428/ > /dev/null 2>&1 && sleep 0' 28983 1726883130.91107: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883130.91110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883130.91113: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883130.91115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883130.91170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883130.91182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883130.91252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883130.93247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883130.93297: stderr chunk (state=3): >>><<< 28983 1726883130.93300: stdout chunk (state=3): >>><<< 28983 1726883130.93316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883130.93322: handler run complete 28983 1726883130.93496: variable 'ansible_facts' from source: unknown 28983 1726883130.93640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883130.94091: variable 'ansible_facts' from source: unknown 28983 1726883130.94218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883130.94666: attempt loop complete, returning result 28983 1726883130.94669: _execute() done 28983 1726883130.94674: dumping result to json 28983 1726883130.94677: done dumping result, returning 28983 1726883130.94679: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affe814-3a2d-b16d-c0a7-0000000026f0] 28983 1726883130.94681: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026f0 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883130.96209: no more pending results, returning what we have 28983 1726883130.96213: results queue empty 28983 1726883130.96214: checking for any_errors_fatal 28983 1726883130.96220: done checking for any_errors_fatal 28983 1726883130.96221: checking for max_fail_percentage 28983 1726883130.96223: done checking for max_fail_percentage 28983 1726883130.96225: checking to see if all hosts have failed and the running result is not ok 28983 1726883130.96226: done checking to see if all hosts have failed 28983 1726883130.96227: getting the remaining hosts 
for this loop 28983 1726883130.96228: done getting the remaining hosts for this loop 28983 1726883130.96232: getting the next task for host managed_node2 28983 1726883130.96243: done getting next task for host managed_node2 28983 1726883130.96248: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883130.96255: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883130.96283: getting variables 28983 1726883130.96286: in VariableManager get_vars() 28983 1726883130.96318: Calling all_inventory to load vars for managed_node2 28983 1726883130.96321: Calling groups_inventory to load vars for managed_node2 28983 1726883130.96322: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883130.96331: Calling all_plugins_play to load vars for managed_node2 28983 1726883130.96335: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883130.96339: Calling groups_plugins_play to load vars for managed_node2 28983 1726883130.96856: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026f0 28983 1726883130.96860: WORKER PROCESS EXITING 28983 1726883130.97678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883130.99298: done with get_vars() 28983 1726883130.99324: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:45:30 -0400 (0:00:02.296) 0:02:40.991 ****** 28983 1726883130.99407: entering _queue_task() for managed_node2/package_facts 28983 1726883130.99654: worker is 1 (out of 1 available) 28983 1726883130.99671: exiting _queue_task() for managed_node2/package_facts 28983 1726883130.99688: done queuing things up, now waiting for results queue to drain 28983 1726883130.99690: waiting for pending results... 
28983 1726883130.99891: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 28983 1726883131.00015: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026f1 28983 1726883131.00032: variable 'ansible_search_path' from source: unknown 28983 1726883131.00037: variable 'ansible_search_path' from source: unknown 28983 1726883131.00069: calling self._execute() 28983 1726883131.00159: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883131.00166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883131.00178: variable 'omit' from source: magic vars 28983 1726883131.00509: variable 'ansible_distribution_major_version' from source: facts 28983 1726883131.00521: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883131.00527: variable 'omit' from source: magic vars 28983 1726883131.00598: variable 'omit' from source: magic vars 28983 1726883131.00625: variable 'omit' from source: magic vars 28983 1726883131.00663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883131.00698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883131.00715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883131.00731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883131.00743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883131.00770: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883131.00776: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883131.00779: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 28983 1726883131.00862: Set connection var ansible_connection to ssh 28983 1726883131.00876: Set connection var ansible_shell_executable to /bin/sh 28983 1726883131.00884: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883131.00895: Set connection var ansible_timeout to 10 28983 1726883131.00901: Set connection var ansible_pipelining to False 28983 1726883131.00904: Set connection var ansible_shell_type to sh 28983 1726883131.00926: variable 'ansible_shell_executable' from source: unknown 28983 1726883131.00929: variable 'ansible_connection' from source: unknown 28983 1726883131.00932: variable 'ansible_module_compression' from source: unknown 28983 1726883131.00936: variable 'ansible_shell_type' from source: unknown 28983 1726883131.00939: variable 'ansible_shell_executable' from source: unknown 28983 1726883131.00944: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883131.00949: variable 'ansible_pipelining' from source: unknown 28983 1726883131.00953: variable 'ansible_timeout' from source: unknown 28983 1726883131.00958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883131.01124: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883131.01136: variable 'omit' from source: magic vars 28983 1726883131.01142: starting attempt loop 28983 1726883131.01147: running the handler 28983 1726883131.01160: _low_level_execute_command(): starting 28983 1726883131.01168: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883131.01718: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 28983 1726883131.01722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.01726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.01728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.01778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883131.01781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883131.01870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883131.03656: stdout chunk (state=3): >>>/root <<< 28983 1726883131.03771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883131.03820: stderr chunk (state=3): >>><<< 28983 1726883131.03824: stdout chunk (state=3): >>><<< 28983 1726883131.03845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883131.03856: _low_level_execute_command(): starting 28983 1726883131.03862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154 `" && echo ansible-tmp-1726883131.0384483-34667-6875405947154="` echo /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154 `" ) && sleep 0' 28983 1726883131.04313: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.04317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.04321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28983 1726883131.04330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.04385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883131.04387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883131.04457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883131.06494: stdout chunk (state=3): >>>ansible-tmp-1726883131.0384483-34667-6875405947154=/root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154 <<< 28983 1726883131.06609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883131.06656: stderr chunk (state=3): >>><<< 28983 1726883131.06660: stdout chunk (state=3): >>><<< 28983 1726883131.06676: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883131.0384483-34667-6875405947154=/root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883131.06710: variable 'ansible_module_compression' from source: unknown 28983 1726883131.06750: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 28983 1726883131.06803: variable 'ansible_facts' from source: unknown 28983 1726883131.06941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py 28983 1726883131.07056: Sending initial data 28983 1726883131.07060: Sent initial data (160 bytes) 28983 1726883131.07506: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883131.07510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.07512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.07514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.07578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883131.07581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883131.07645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883131.09325: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 28983 1726883131.09329: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883131.09392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883131.09462: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpnemvtxvi /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py <<< 28983 1726883131.09471: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py" <<< 28983 1726883131.09530: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpnemvtxvi" to remote "/root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py" <<< 28983 1726883131.11360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883131.11409: stderr chunk (state=3): >>><<< 28983 1726883131.11413: stdout chunk (state=3): >>><<< 28983 1726883131.11431: done transferring module to remote 28983 1726883131.11442: _low_level_execute_command(): starting 28983 1726883131.11450: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/ /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py && sleep 0' 28983 1726883131.11879: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.11884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.11887: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883131.11889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883131.11892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.11946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883131.11954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883131.12025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883131.13927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883131.13973: stderr chunk (state=3): >>><<< 28983 1726883131.13978: stdout chunk (state=3): >>><<< 28983 1726883131.13993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883131.13996: _low_level_execute_command(): starting 28983 1726883131.14002: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/AnsiballZ_package_facts.py && sleep 0' 28983 1726883131.14392: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.14426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883131.14429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.14431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.14437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.14491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883131.14494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883131.14575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883131.77754: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 28983 1726883131.77778: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": 
[{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": 
[{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 28983 1726883131.77804: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": 
[{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": 
"e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 28983 1726883131.77827: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": 
"nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": 
"7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 28983 1726883131.77850: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 28983 1726883131.77861: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "<<< 28983 1726883131.77901: stdout chunk (state=3): >>>version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc3<<< 28983 1726883131.77909: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", 
"version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", 
"version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", <<< 28983 1726883131.77914: stdout chunk 
(state=3): >>>"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": 
"2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", 
"version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": 
"502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": 
[{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "sou<<< 28983 1726883131.77947: stdout chunk (state=3): >>>rce": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", 
"version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": 
[{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "a<<< 28983 1726883131.77965: stdout chunk (state=3): >>>spell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "n<<< 28983 1726883131.77981: stdout chunk (state=3): >>>oarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": 
"1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": 
[{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", <<< 28983 1726883131.77996: stdout chunk (state=3): >>>"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "a<<< 28983 1726883131.78024: stdout chunk (state=3): >>>rch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", 
"version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 28983 1726883131.79857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883131.79917: stderr chunk (state=3): >>><<< 28983 1726883131.79920: stdout chunk (state=3): >>><<< 28983 1726883131.79966: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": 
"zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": 
"1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": 
[{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, 
"arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": 
"xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": 
[{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chkconfig": [{"name": "chkconfig", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts": [{"name": "initscripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "network-scripts": [{"name": "network-scripts", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883131.82318: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883131.82337: _low_level_execute_command(): starting 28983 1726883131.82343: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883131.0384483-34667-6875405947154/ > /dev/null 2>&1 && sleep 0' 28983 1726883131.82805: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883131.82816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883131.82846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883131.82852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883131.82912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883131.82915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883131.82919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883131.82993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883131.84980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883131.85032: stderr chunk (state=3): >>><<< 28983 1726883131.85037: stdout chunk (state=3): >>><<< 28983 1726883131.85050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883131.85057: handler run 
complete 28983 1726883131.85882: variable 'ansible_facts' from source: unknown 28983 1726883131.86348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883131.88435: variable 'ansible_facts' from source: unknown 28983 1726883131.88883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883131.89652: attempt loop complete, returning result 28983 1726883131.89670: _execute() done 28983 1726883131.89673: dumping result to json 28983 1726883131.89859: done dumping result, returning 28983 1726883131.89867: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affe814-3a2d-b16d-c0a7-0000000026f1] 28983 1726883131.89873: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026f1 28983 1726883131.98620: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026f1 28983 1726883131.98623: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883131.98821: no more pending results, returning what we have 28983 1726883131.98824: results queue empty 28983 1726883131.98826: checking for any_errors_fatal 28983 1726883131.98833: done checking for any_errors_fatal 28983 1726883131.98846: checking for max_fail_percentage 28983 1726883131.98848: done checking for max_fail_percentage 28983 1726883131.98849: checking to see if all hosts have failed and the running result is not ok 28983 1726883131.98850: done checking to see if all hosts have failed 28983 1726883131.98851: getting the remaining hosts for this loop 28983 1726883131.98853: done getting the remaining hosts for this loop 28983 1726883131.98858: getting the next task for host managed_node2 28983 1726883131.98866: done getting next task for host managed_node2 28983 
1726883131.98871: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883131.98880: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883131.98898: getting variables 28983 1726883131.98900: in VariableManager get_vars() 28983 1726883131.98944: Calling all_inventory to load vars for managed_node2 28983 1726883131.98954: Calling groups_inventory to load vars for managed_node2 28983 1726883131.98958: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883131.98968: Calling all_plugins_play to load vars for managed_node2 28983 1726883131.98975: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883131.98979: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.01336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.04322: done with get_vars() 28983 1726883132.04359: done getting variables 28983 1726883132.04412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:45:32 -0400 (0:00:01.050) 0:02:42.042 ****** 28983 1726883132.04453: entering _queue_task() for managed_node2/debug 28983 1726883132.04742: worker is 1 (out of 1 available) 28983 1726883132.04758: exiting _queue_task() for managed_node2/debug 28983 1726883132.04776: done queuing things up, now waiting for results queue to drain 28983 1726883132.04778: waiting for pending results... 
28983 1726883132.04990: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 28983 1726883132.05122: in run() - task 0affe814-3a2d-b16d-c0a7-000000002695 28983 1726883132.05138: variable 'ansible_search_path' from source: unknown 28983 1726883132.05142: variable 'ansible_search_path' from source: unknown 28983 1726883132.05178: calling self._execute() 28983 1726883132.05271: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.05277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.05290: variable 'omit' from source: magic vars 28983 1726883132.05623: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.05636: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.05644: variable 'omit' from source: magic vars 28983 1726883132.05705: variable 'omit' from source: magic vars 28983 1726883132.05787: variable 'network_provider' from source: set_fact 28983 1726883132.05803: variable 'omit' from source: magic vars 28983 1726883132.05841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883132.05876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883132.05893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883132.05930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883132.05935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883132.05963: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883132.05967: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 
1726883132.05971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.06159: Set connection var ansible_connection to ssh 28983 1726883132.06162: Set connection var ansible_shell_executable to /bin/sh 28983 1726883132.06165: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883132.06168: Set connection var ansible_timeout to 10 28983 1726883132.06171: Set connection var ansible_pipelining to False 28983 1726883132.06176: Set connection var ansible_shell_type to sh 28983 1726883132.06179: variable 'ansible_shell_executable' from source: unknown 28983 1726883132.06182: variable 'ansible_connection' from source: unknown 28983 1726883132.06184: variable 'ansible_module_compression' from source: unknown 28983 1726883132.06186: variable 'ansible_shell_type' from source: unknown 28983 1726883132.06189: variable 'ansible_shell_executable' from source: unknown 28983 1726883132.06191: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.06193: variable 'ansible_pipelining' from source: unknown 28983 1726883132.06196: variable 'ansible_timeout' from source: unknown 28983 1726883132.06198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.06374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883132.06468: variable 'omit' from source: magic vars 28983 1726883132.06471: starting attempt loop 28983 1726883132.06477: running the handler 28983 1726883132.06493: handler run complete 28983 1726883132.06519: attempt loop complete, returning result 28983 1726883132.06527: _execute() done 28983 1726883132.06536: dumping result to json 28983 1726883132.06545: done dumping result, returning 
28983 1726883132.06559: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affe814-3a2d-b16d-c0a7-000000002695] 28983 1726883132.06582: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002695 28983 1726883132.06856: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002695 28983 1726883132.06860: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 28983 1726883132.06959: no more pending results, returning what we have 28983 1726883132.06963: results queue empty 28983 1726883132.06964: checking for any_errors_fatal 28983 1726883132.06984: done checking for any_errors_fatal 28983 1726883132.06985: checking for max_fail_percentage 28983 1726883132.06988: done checking for max_fail_percentage 28983 1726883132.06989: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.06990: done checking to see if all hosts have failed 28983 1726883132.06991: getting the remaining hosts for this loop 28983 1726883132.06994: done getting the remaining hosts for this loop 28983 1726883132.06999: getting the next task for host managed_node2 28983 1726883132.07010: done getting next task for host managed_node2 28983 1726883132.07014: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883132.07020: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.07051: getting variables 28983 1726883132.07053: in VariableManager get_vars() 28983 1726883132.07111: Calling all_inventory to load vars for managed_node2 28983 1726883132.07114: Calling groups_inventory to load vars for managed_node2 28983 1726883132.07116: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.07126: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.07129: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.07133: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.14559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.17543: done with get_vars() 28983 1726883132.17587: done getting variables 28983 1726883132.17646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:45:32 -0400 (0:00:00.132) 0:02:42.174 ****** 28983 1726883132.17692: entering _queue_task() for managed_node2/fail 28983 1726883132.18091: worker is 1 (out of 1 available) 28983 1726883132.18105: exiting _queue_task() for managed_node2/fail 28983 1726883132.18118: done queuing things up, now waiting for results queue to drain 28983 1726883132.18121: waiting for pending results... 28983 1726883132.18467: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 28983 1726883132.18591: in run() - task 0affe814-3a2d-b16d-c0a7-000000002696 28983 1726883132.18671: variable 'ansible_search_path' from source: unknown 28983 1726883132.18681: variable 'ansible_search_path' from source: unknown 28983 1726883132.18686: calling self._execute() 28983 1726883132.18782: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.18788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.18809: variable 'omit' from source: magic vars 28983 1726883132.19295: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.19310: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.19487: variable 'network_state' from source: role '' defaults 28983 1726883132.19542: Evaluated conditional (network_state != {}): False 28983 1726883132.19546: when evaluation is False, skipping this task 28983 1726883132.19549: _execute() done 28983 1726883132.19552: dumping result to json 28983 1726883132.19554: done dumping result, returning 28983 1726883132.19557: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affe814-3a2d-b16d-c0a7-000000002696] 28983 1726883132.19561: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002696 28983 1726883132.19851: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002696 28983 1726883132.19854: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883132.19900: no more pending results, returning what we have 28983 1726883132.19903: results queue empty 28983 1726883132.19904: checking for any_errors_fatal 28983 1726883132.19912: done checking for any_errors_fatal 28983 1726883132.19912: checking for max_fail_percentage 28983 1726883132.19914: done checking for max_fail_percentage 28983 1726883132.19915: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.19916: done checking to see if all hosts have failed 28983 1726883132.19917: getting the remaining hosts for this loop 28983 1726883132.19919: done getting the remaining hosts for this loop 28983 1726883132.19922: getting the next task for host managed_node2 28983 1726883132.19930: done getting next task for host managed_node2 28983 1726883132.19937: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883132.19943: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.19970: getting variables 28983 1726883132.19974: in VariableManager get_vars() 28983 1726883132.20019: Calling all_inventory to load vars for managed_node2 28983 1726883132.20022: Calling groups_inventory to load vars for managed_node2 28983 1726883132.20025: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.20037: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.20041: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.20046: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.22219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.25231: done with get_vars() 28983 1726883132.25270: done getting variables 28983 1726883132.25340: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:45:32 -0400 (0:00:00.076) 0:02:42.251 ****** 28983 1726883132.25384: entering _queue_task() for managed_node2/fail 28983 1726883132.25712: worker is 1 (out of 1 available) 28983 1726883132.25726: exiting _queue_task() for managed_node2/fail 28983 1726883132.25742: done queuing things up, now waiting for results queue to drain 28983 1726883132.25744: waiting for pending results... 28983 1726883132.26155: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 28983 1726883132.26287: in run() - task 0affe814-3a2d-b16d-c0a7-000000002697 28983 1726883132.26312: variable 'ansible_search_path' from source: unknown 28983 1726883132.26322: variable 'ansible_search_path' from source: unknown 28983 1726883132.26378: calling self._execute() 28983 1726883132.26507: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.26521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.26542: variable 'omit' from source: magic vars 28983 1726883132.27002: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.27025: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.27199: variable 'network_state' from source: role '' defaults 28983 1726883132.27216: Evaluated conditional (network_state != {}): False 28983 1726883132.27228: when evaluation is False, skipping this task 28983 1726883132.27239: _execute() done 28983 1726883132.27248: dumping result to json 28983 1726883132.27259: done dumping result, returning 28983 1726883132.27274: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affe814-3a2d-b16d-c0a7-000000002697] 28983 1726883132.27289: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002697 28983 1726883132.27640: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002697 28983 1726883132.27644: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883132.27691: no more pending results, returning what we have 28983 1726883132.27695: results queue empty 28983 1726883132.27696: checking for any_errors_fatal 28983 1726883132.27702: done checking for any_errors_fatal 28983 1726883132.27703: checking for max_fail_percentage 28983 1726883132.27706: done checking for max_fail_percentage 28983 1726883132.27707: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.27707: done checking to see if all hosts have failed 28983 1726883132.27708: getting the remaining hosts for this loop 28983 1726883132.27710: done getting the remaining hosts for this loop 28983 1726883132.27714: getting the next task for host managed_node2 28983 1726883132.27723: done getting next task for host managed_node2 28983 1726883132.27727: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883132.27735: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.27763: getting variables 28983 1726883132.27764: in VariableManager get_vars() 28983 1726883132.27808: Calling all_inventory to load vars for managed_node2 28983 1726883132.27811: Calling groups_inventory to load vars for managed_node2 28983 1726883132.27814: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.27823: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.27826: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.27830: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.29836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.31724: done with get_vars() 28983 1726883132.31760: done getting variables 28983 1726883132.31838: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:45:32 -0400 (0:00:00.064) 0:02:42.316 ****** 28983 1726883132.31880: entering _queue_task() for managed_node2/fail 28983 1726883132.32205: worker is 1 (out of 1 available) 28983 1726883132.32221: exiting _queue_task() for managed_node2/fail 28983 1726883132.32236: done queuing things up, now waiting for results queue to drain 28983 1726883132.32238: waiting for pending results... 28983 1726883132.32493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 28983 1726883132.32702: in run() - task 0affe814-3a2d-b16d-c0a7-000000002698 28983 1726883132.32736: variable 'ansible_search_path' from source: unknown 28983 1726883132.32741: variable 'ansible_search_path' from source: unknown 28983 1726883132.32779: calling self._execute() 28983 1726883132.32905: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.32913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.32928: variable 'omit' from source: magic vars 28983 1726883132.33453: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.33458: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.33665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883132.35654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.35657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.35840: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.35845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.35848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.35877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.35902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.35933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.35984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.36001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.36120: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.36137: Evaluated conditional (ansible_distribution_major_version | int > 9): True 28983 1726883132.36280: variable 'ansible_distribution' from source: facts 28983 1726883132.36289: variable '__network_rh_distros' from source: role '' defaults 28983 1726883132.36302: Evaluated conditional (ansible_distribution in __network_rh_distros): False 28983 1726883132.36306: when evaluation is False, skipping this task 28983 
1726883132.36308: _execute() done 28983 1726883132.36311: dumping result to json 28983 1726883132.36314: done dumping result, returning 28983 1726883132.36339: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affe814-3a2d-b16d-c0a7-000000002698] 28983 1726883132.36343: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002698 28983 1726883132.36492: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002698 28983 1726883132.36495: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 28983 1726883132.36562: no more pending results, returning what we have 28983 1726883132.36566: results queue empty 28983 1726883132.36567: checking for any_errors_fatal 28983 1726883132.36576: done checking for any_errors_fatal 28983 1726883132.36577: checking for max_fail_percentage 28983 1726883132.36579: done checking for max_fail_percentage 28983 1726883132.36580: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.36581: done checking to see if all hosts have failed 28983 1726883132.36581: getting the remaining hosts for this loop 28983 1726883132.36583: done getting the remaining hosts for this loop 28983 1726883132.36588: getting the next task for host managed_node2 28983 1726883132.36596: done getting next task for host managed_node2 28983 1726883132.36602: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883132.36607: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.36732: getting variables 28983 1726883132.36737: in VariableManager get_vars() 28983 1726883132.36780: Calling all_inventory to load vars for managed_node2 28983 1726883132.36783: Calling groups_inventory to load vars for managed_node2 28983 1726883132.36786: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.36801: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.36805: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.36809: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.38249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.39882: done with get_vars() 28983 1726883132.39905: done getting variables 28983 1726883132.39957: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:45:32 -0400 (0:00:00.081) 0:02:42.397 ****** 28983 1726883132.39988: entering _queue_task() for managed_node2/dnf 28983 1726883132.40242: worker is 1 (out of 1 available) 28983 1726883132.40258: exiting _queue_task() for managed_node2/dnf 28983 1726883132.40274: done queuing things up, now waiting for results queue to drain 28983 1726883132.40276: waiting for pending results... 28983 1726883132.40482: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 28983 1726883132.40608: in run() - task 0affe814-3a2d-b16d-c0a7-000000002699 28983 1726883132.40625: variable 'ansible_search_path' from source: unknown 28983 1726883132.40628: variable 'ansible_search_path' from source: unknown 28983 1726883132.40660: calling self._execute() 28983 1726883132.40745: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.40751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.40763: variable 'omit' from source: magic vars 28983 1726883132.41090: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.41100: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.41280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883132.43076: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.43140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.43171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.43204: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.43228: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.43300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.43322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.43348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.43384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.43397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.43494: variable 'ansible_distribution' from source: facts 28983 1726883132.43497: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.43505: Evaluated conditional (ansible_distribution == 'Fedora' or 
ansible_distribution_major_version | int > 7): True 28983 1726883132.43600: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.43716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.43737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.43759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.43797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.43809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.43847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.43866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.43892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.43924: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.43938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.43974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.43996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.44018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.44051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.44063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.44202: variable 'network_connections' from source: include params 28983 1726883132.44213: variable 'interface' from source: play vars 28983 1726883132.44267: variable 'interface' from source: play vars 28983 1726883132.44332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883132.44459: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883132.44502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883132.44532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883132.44561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883132.44596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883132.44614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883132.44641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.44666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883132.44707: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883132.44911: variable 'network_connections' from source: include params 28983 1726883132.44916: variable 'interface' from source: play vars 28983 1726883132.44968: variable 'interface' from source: play vars 28983 1726883132.44993: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883132.44997: when evaluation is False, skipping this task 28983 1726883132.45000: _execute() done 28983 1726883132.45005: dumping result to json 28983 1726883132.45009: done dumping result, returning 28983 1726883132.45016: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-000000002699] 28983 1726883132.45022: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002699 28983 1726883132.45120: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002699 28983 1726883132.45124: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883132.45189: no more pending results, returning what we have 28983 1726883132.45193: results queue empty 28983 1726883132.45194: checking for any_errors_fatal 28983 1726883132.45203: done checking for any_errors_fatal 28983 1726883132.45204: checking for max_fail_percentage 28983 1726883132.45206: done checking for max_fail_percentage 28983 1726883132.45207: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.45208: done checking to see if all hosts have failed 28983 1726883132.45209: getting the remaining hosts for this loop 28983 1726883132.45212: done getting the remaining hosts for this loop 28983 1726883132.45216: getting the next task for host managed_node2 28983 1726883132.45225: done getting next task for host managed_node2 28983 1726883132.45229: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883132.45238: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.45268: getting variables 28983 1726883132.45270: in VariableManager get_vars() 28983 1726883132.45314: Calling all_inventory to load vars for managed_node2 28983 1726883132.45317: Calling groups_inventory to load vars for managed_node2 28983 1726883132.45320: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.45328: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.45332: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.45343: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.46594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.48207: done with get_vars() 28983 1726883132.48230: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 28983 1726883132.48295: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:45:32 -0400 (0:00:00.083) 0:02:42.481 ****** 28983 1726883132.48322: entering _queue_task() for managed_node2/yum 28983 1726883132.48550: worker is 1 (out of 1 available) 28983 1726883132.48564: exiting _queue_task() for managed_node2/yum 28983 1726883132.48581: done queuing things up, now waiting for results queue to drain 28983 1726883132.48583: waiting for pending results... 28983 1726883132.48779: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 28983 1726883132.48893: in run() - task 0affe814-3a2d-b16d-c0a7-00000000269a 28983 1726883132.48908: variable 'ansible_search_path' from source: unknown 28983 1726883132.48913: variable 'ansible_search_path' from source: unknown 28983 1726883132.48949: calling self._execute() 28983 1726883132.49043: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.49047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.49057: variable 'omit' from source: magic vars 28983 1726883132.49391: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.49401: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.49558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped 
due to reserved name 28983 1726883132.51614: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.51678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.51708: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.51740: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.51766: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.51831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.51855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.51886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.51915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.51927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.52008: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.52020: Evaluated conditional (ansible_distribution_major_version | int < 8): 
False 28983 1726883132.52023: when evaluation is False, skipping this task 28983 1726883132.52026: _execute() done 28983 1726883132.52032: dumping result to json 28983 1726883132.52038: done dumping result, returning 28983 1726883132.52045: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000269a] 28983 1726883132.52051: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269a 28983 1726883132.52151: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269a 28983 1726883132.52154: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 28983 1726883132.52211: no more pending results, returning what we have 28983 1726883132.52214: results queue empty 28983 1726883132.52215: checking for any_errors_fatal 28983 1726883132.52221: done checking for any_errors_fatal 28983 1726883132.52222: checking for max_fail_percentage 28983 1726883132.52224: done checking for max_fail_percentage 28983 1726883132.52225: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.52226: done checking to see if all hosts have failed 28983 1726883132.52227: getting the remaining hosts for this loop 28983 1726883132.52229: done getting the remaining hosts for this loop 28983 1726883132.52233: getting the next task for host managed_node2 28983 1726883132.52244: done getting next task for host managed_node2 28983 1726883132.52249: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883132.52254: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883132.52286: getting variables 28983 1726883132.52287: in VariableManager get_vars() 28983 1726883132.52328: Calling all_inventory to load vars for managed_node2 28983 1726883132.52331: Calling groups_inventory to load vars for managed_node2 28983 1726883132.52341: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.52351: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.52354: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.52358: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.53719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.55316: done with get_vars() 28983 1726883132.55340: done getting variables 28983 1726883132.55391: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:45:32 -0400 (0:00:00.070) 0:02:42.551 ****** 28983 1726883132.55419: entering _queue_task() for managed_node2/fail 28983 1726883132.55659: worker is 1 (out of 1 available) 28983 1726883132.55675: exiting _queue_task() for managed_node2/fail 28983 1726883132.55689: done queuing things up, now waiting for results queue to drain 28983 1726883132.55691: waiting for pending results... 
28983 1726883132.55898: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 28983 1726883132.56035: in run() - task 0affe814-3a2d-b16d-c0a7-00000000269b 28983 1726883132.56051: variable 'ansible_search_path' from source: unknown 28983 1726883132.56056: variable 'ansible_search_path' from source: unknown 28983 1726883132.56091: calling self._execute() 28983 1726883132.56179: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.56185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.56197: variable 'omit' from source: magic vars 28983 1726883132.56526: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.56537: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.56654: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.56833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883132.58630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.58698: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.58731: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.58767: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.58794: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.58865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 28983 1726883132.58893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.58914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.58948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.58962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.59008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.59027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.59049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.59088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.59101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.59138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.59157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.59180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.59215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.59227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.59380: variable 'network_connections' from source: include params 28983 1726883132.59393: variable 'interface' from source: play vars 28983 1726883132.59449: variable 'interface' from source: play vars 28983 1726883132.59515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883132.59646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883132.59813: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883132.59847: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883132.59873: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883132.59910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883132.59929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883132.59958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.59983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883132.60024: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883132.60236: variable 'network_connections' from source: include params 28983 1726883132.60240: variable 'interface' from source: play vars 28983 1726883132.60299: variable 'interface' from source: play vars 28983 1726883132.60319: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883132.60323: when evaluation is False, skipping this task 28983 1726883132.60326: _execute() done 28983 1726883132.60331: dumping result to json 28983 1726883132.60336: done dumping result, returning 28983 1726883132.60344: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000269b] 28983 1726883132.60350: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269b 28983 1726883132.60449: done sending task result for task 
0affe814-3a2d-b16d-c0a7-00000000269b 28983 1726883132.60452: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883132.60527: no more pending results, returning what we have 28983 1726883132.60531: results queue empty 28983 1726883132.60532: checking for any_errors_fatal 28983 1726883132.60544: done checking for any_errors_fatal 28983 1726883132.60545: checking for max_fail_percentage 28983 1726883132.60547: done checking for max_fail_percentage 28983 1726883132.60548: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.60549: done checking to see if all hosts have failed 28983 1726883132.60550: getting the remaining hosts for this loop 28983 1726883132.60552: done getting the remaining hosts for this loop 28983 1726883132.60557: getting the next task for host managed_node2 28983 1726883132.60568: done getting next task for host managed_node2 28983 1726883132.60572: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 28983 1726883132.60579: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.60608: getting variables 28983 1726883132.60610: in VariableManager get_vars() 28983 1726883132.60663: Calling all_inventory to load vars for managed_node2 28983 1726883132.60666: Calling groups_inventory to load vars for managed_node2 28983 1726883132.60668: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.60677: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.60680: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.60683: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.62046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.63677: done with get_vars() 28983 1726883132.63702: done getting variables 28983 1726883132.63751: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:45:32 -0400 (0:00:00.083) 0:02:42.635 ****** 28983 1726883132.63784: entering _queue_task() for managed_node2/package 28983 1726883132.64014: worker is 1 (out of 1 available) 28983 1726883132.64029: exiting _queue_task() for managed_node2/package 28983 1726883132.64043: done queuing things up, now 
waiting for results queue to drain 28983 1726883132.64044: waiting for pending results... 28983 1726883132.64249: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 28983 1726883132.64366: in run() - task 0affe814-3a2d-b16d-c0a7-00000000269c 28983 1726883132.64383: variable 'ansible_search_path' from source: unknown 28983 1726883132.64389: variable 'ansible_search_path' from source: unknown 28983 1726883132.64419: calling self._execute() 28983 1726883132.64510: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.64517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.64528: variable 'omit' from source: magic vars 28983 1726883132.64853: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.64864: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.65042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883132.65264: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883132.65306: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883132.65333: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883132.65394: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883132.65497: variable 'network_packages' from source: role '' defaults 28983 1726883132.65586: variable '__network_provider_setup' from source: role '' defaults 28983 1726883132.65597: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883132.65651: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883132.65658: variable '__network_packages_default_nm' 
from source: role '' defaults 28983 1726883132.65713: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883132.65878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883132.67452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.67506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.67536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.67569: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.67593: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.67675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.67699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.67720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.67755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.67771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.67810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.67829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.67851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.67890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.67900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.68092: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883132.68185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.68209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.68231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 28983 1726883132.68263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.68277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.68355: variable 'ansible_python' from source: facts 28983 1726883132.68370: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883132.68443: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883132.68509: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883132.68617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.68642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.68664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.68696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.68709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 
1726883132.68753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.68779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.68799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.68829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.68846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.68967: variable 'network_connections' from source: include params 28983 1726883132.68976: variable 'interface' from source: play vars 28983 1726883132.69058: variable 'interface' from source: play vars 28983 1726883132.69118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883132.69141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883132.69166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 
1726883132.69196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883132.69240: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.69481: variable 'network_connections' from source: include params 28983 1726883132.69485: variable 'interface' from source: play vars 28983 1726883132.69569: variable 'interface' from source: play vars 28983 1726883132.69596: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883132.69664: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.69919: variable 'network_connections' from source: include params 28983 1726883132.69923: variable 'interface' from source: play vars 28983 1726883132.69982: variable 'interface' from source: play vars 28983 1726883132.70001: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883132.70068: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883132.70322: variable 'network_connections' from source: include params 28983 1726883132.70326: variable 'interface' from source: play vars 28983 1726883132.70383: variable 'interface' from source: play vars 28983 1726883132.70428: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883132.70479: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883132.70489: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883132.70539: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883132.70726: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883132.71126: variable 'network_connections' from source: include params 28983 
1726883132.71130: variable 'interface' from source: play vars 28983 1726883132.71185: variable 'interface' from source: play vars 28983 1726883132.71192: variable 'ansible_distribution' from source: facts 28983 1726883132.71195: variable '__network_rh_distros' from source: role '' defaults 28983 1726883132.71203: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.71215: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883132.71357: variable 'ansible_distribution' from source: facts 28983 1726883132.71361: variable '__network_rh_distros' from source: role '' defaults 28983 1726883132.71367: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.71377: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883132.71513: variable 'ansible_distribution' from source: facts 28983 1726883132.71517: variable '__network_rh_distros' from source: role '' defaults 28983 1726883132.71523: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.71554: variable 'network_provider' from source: set_fact 28983 1726883132.71569: variable 'ansible_facts' from source: unknown 28983 1726883132.72275: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 28983 1726883132.72279: when evaluation is False, skipping this task 28983 1726883132.72282: _execute() done 28983 1726883132.72284: dumping result to json 28983 1726883132.72286: done dumping result, returning 28983 1726883132.72295: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affe814-3a2d-b16d-c0a7-00000000269c] 28983 1726883132.72301: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269c 28983 1726883132.72404: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269c 28983 1726883132.72408: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 28983 1726883132.72491: no more pending results, returning what we have 28983 1726883132.72495: results queue empty 28983 1726883132.72496: checking for any_errors_fatal 28983 1726883132.72505: done checking for any_errors_fatal 28983 1726883132.72506: checking for max_fail_percentage 28983 1726883132.72508: done checking for max_fail_percentage 28983 1726883132.72509: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.72510: done checking to see if all hosts have failed 28983 1726883132.72511: getting the remaining hosts for this loop 28983 1726883132.72513: done getting the remaining hosts for this loop 28983 1726883132.72526: getting the next task for host managed_node2 28983 1726883132.72537: done getting next task for host managed_node2 28983 1726883132.72543: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883132.72548: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.72580: getting variables 28983 1726883132.72582: in VariableManager get_vars() 28983 1726883132.72639: Calling all_inventory to load vars for managed_node2 28983 1726883132.72643: Calling groups_inventory to load vars for managed_node2 28983 1726883132.72645: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.72654: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.72657: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.72661: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.73954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.75575: done with get_vars() 28983 1726883132.75599: done getting variables 28983 1726883132.75649: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:45:32 -0400 (0:00:00.118) 0:02:42.754 ****** 28983 1726883132.75683: entering _queue_task() for managed_node2/package 28983 1726883132.75939: worker is 1 (out of 1 available) 28983 1726883132.75953: exiting _queue_task() for managed_node2/package 28983 1726883132.75968: done queuing things up, now waiting for results 
queue to drain 28983 1726883132.75970: waiting for pending results... 28983 1726883132.76171: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 28983 1726883132.76293: in run() - task 0affe814-3a2d-b16d-c0a7-00000000269d 28983 1726883132.76309: variable 'ansible_search_path' from source: unknown 28983 1726883132.76312: variable 'ansible_search_path' from source: unknown 28983 1726883132.76354: calling self._execute() 28983 1726883132.76449: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.76456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.76466: variable 'omit' from source: magic vars 28983 1726883132.76808: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.76819: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.76930: variable 'network_state' from source: role '' defaults 28983 1726883132.76940: Evaluated conditional (network_state != {}): False 28983 1726883132.76944: when evaluation is False, skipping this task 28983 1726883132.76947: _execute() done 28983 1726883132.76951: dumping result to json 28983 1726883132.76958: done dumping result, returning 28983 1726883132.76964: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-00000000269d] 28983 1726883132.76975: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269d 28983 1726883132.77079: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269d 28983 1726883132.77083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883132.77138: no more pending results, returning what we have 28983 
1726883132.77142: results queue empty 28983 1726883132.77143: checking for any_errors_fatal 28983 1726883132.77149: done checking for any_errors_fatal 28983 1726883132.77150: checking for max_fail_percentage 28983 1726883132.77152: done checking for max_fail_percentage 28983 1726883132.77153: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.77154: done checking to see if all hosts have failed 28983 1726883132.77155: getting the remaining hosts for this loop 28983 1726883132.77157: done getting the remaining hosts for this loop 28983 1726883132.77161: getting the next task for host managed_node2 28983 1726883132.77171: done getting next task for host managed_node2 28983 1726883132.77175: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883132.77181: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 28983 1726883132.77209: getting variables 28983 1726883132.77210: in VariableManager get_vars() 28983 1726883132.77259: Calling all_inventory to load vars for managed_node2 28983 1726883132.77262: Calling groups_inventory to load vars for managed_node2 28983 1726883132.77265: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.77273: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.77277: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.77281: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.78633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.80219: done with get_vars() 28983 1726883132.80243: done getting variables 28983 1726883132.80293: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:45:32 -0400 (0:00:00.046) 0:02:42.801 ****** 28983 1726883132.80321: entering _queue_task() for managed_node2/package 28983 1726883132.80548: worker is 1 (out of 1 available) 28983 1726883132.80562: exiting _queue_task() for managed_node2/package 28983 1726883132.80576: done queuing things up, now waiting for results queue to drain 28983 1726883132.80579: waiting for pending results... 
28983 1726883132.80776: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 28983 1726883132.80889: in run() - task 0affe814-3a2d-b16d-c0a7-00000000269e 28983 1726883132.80903: variable 'ansible_search_path' from source: unknown 28983 1726883132.80906: variable 'ansible_search_path' from source: unknown 28983 1726883132.80942: calling self._execute() 28983 1726883132.81030: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.81035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.81050: variable 'omit' from source: magic vars 28983 1726883132.81371: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.81385: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.81491: variable 'network_state' from source: role '' defaults 28983 1726883132.81503: Evaluated conditional (network_state != {}): False 28983 1726883132.81506: when evaluation is False, skipping this task 28983 1726883132.81509: _execute() done 28983 1726883132.81513: dumping result to json 28983 1726883132.81518: done dumping result, returning 28983 1726883132.81525: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affe814-3a2d-b16d-c0a7-00000000269e] 28983 1726883132.81531: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269e 28983 1726883132.81636: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269e 28983 1726883132.81639: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883132.81690: no more pending results, returning what we have 28983 1726883132.81694: results queue empty 28983 1726883132.81695: checking for 
any_errors_fatal 28983 1726883132.81701: done checking for any_errors_fatal 28983 1726883132.81702: checking for max_fail_percentage 28983 1726883132.81704: done checking for max_fail_percentage 28983 1726883132.81705: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.81706: done checking to see if all hosts have failed 28983 1726883132.81707: getting the remaining hosts for this loop 28983 1726883132.81709: done getting the remaining hosts for this loop 28983 1726883132.81714: getting the next task for host managed_node2 28983 1726883132.81722: done getting next task for host managed_node2 28983 1726883132.81727: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883132.81733: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883132.81761: getting variables 28983 1726883132.81762: in VariableManager get_vars() 28983 1726883132.81801: Calling all_inventory to load vars for managed_node2 28983 1726883132.81808: Calling groups_inventory to load vars for managed_node2 28983 1726883132.81811: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.81818: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.81820: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.81822: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.83140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.84726: done with get_vars() 28983 1726883132.84752: done getting variables 28983 1726883132.84798: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:45:32 -0400 (0:00:00.045) 0:02:42.846 ****** 28983 1726883132.84824: entering _queue_task() for managed_node2/service 28983 1726883132.85040: worker is 1 (out of 1 available) 28983 1726883132.85054: exiting _queue_task() for managed_node2/service 28983 1726883132.85067: done queuing things up, now waiting for results queue to drain 28983 1726883132.85069: waiting for pending results... 
28983 1726883132.85262: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 28983 1726883132.85382: in run() - task 0affe814-3a2d-b16d-c0a7-00000000269f 28983 1726883132.85395: variable 'ansible_search_path' from source: unknown 28983 1726883132.85400: variable 'ansible_search_path' from source: unknown 28983 1726883132.85433: calling self._execute() 28983 1726883132.85520: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.85529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.85543: variable 'omit' from source: magic vars 28983 1726883132.85857: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.85868: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.85974: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.86146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883132.87923: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.87985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.88020: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.88052: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.88075: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.88147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 28983 1726883132.88170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.88193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.88225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.88242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.88285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.88304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.88324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.88361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.88373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.88409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.88429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.88452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.88489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.88501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.88646: variable 'network_connections' from source: include params 28983 1726883132.88656: variable 'interface' from source: play vars 28983 1726883132.88713: variable 'interface' from source: play vars 28983 1726883132.88777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883132.88911: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883132.88951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883132.88980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883132.89009: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883132.89048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883132.89066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883132.89089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.89113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883132.89159: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883132.89368: variable 'network_connections' from source: include params 28983 1726883132.89372: variable 'interface' from source: play vars 28983 1726883132.89426: variable 'interface' from source: play vars 28983 1726883132.89449: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 28983 1726883132.89453: when evaluation is False, skipping this task 28983 1726883132.89456: _execute() done 28983 1726883132.89459: dumping result to json 28983 1726883132.89466: done dumping result, returning 28983 1726883132.89469: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affe814-3a2d-b16d-c0a7-00000000269f] 28983 1726883132.89478: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000269f 28983 1726883132.89573: done sending task result for task 
0affe814-3a2d-b16d-c0a7-00000000269f 28983 1726883132.89583: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 28983 1726883132.89637: no more pending results, returning what we have 28983 1726883132.89640: results queue empty 28983 1726883132.89641: checking for any_errors_fatal 28983 1726883132.89648: done checking for any_errors_fatal 28983 1726883132.89649: checking for max_fail_percentage 28983 1726883132.89651: done checking for max_fail_percentage 28983 1726883132.89653: checking to see if all hosts have failed and the running result is not ok 28983 1726883132.89653: done checking to see if all hosts have failed 28983 1726883132.89654: getting the remaining hosts for this loop 28983 1726883132.89656: done getting the remaining hosts for this loop 28983 1726883132.89661: getting the next task for host managed_node2 28983 1726883132.89669: done getting next task for host managed_node2 28983 1726883132.89674: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883132.89680: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883132.89711: getting variables 28983 1726883132.89713: in VariableManager get_vars() 28983 1726883132.89755: Calling all_inventory to load vars for managed_node2 28983 1726883132.89758: Calling groups_inventory to load vars for managed_node2 28983 1726883132.89761: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883132.89769: Calling all_plugins_play to load vars for managed_node2 28983 1726883132.89772: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883132.89776: Calling groups_plugins_play to load vars for managed_node2 28983 1726883132.91015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883132.92623: done with get_vars() 28983 1726883132.92649: done getting variables 28983 1726883132.92696: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:45:32 -0400 (0:00:00.078) 0:02:42.925 ****** 28983 1726883132.92722: entering _queue_task() for managed_node2/service 28983 1726883132.92945: worker is 1 (out of 1 available) 28983 1726883132.92958: exiting _queue_task() for managed_node2/service 28983 1726883132.92971: done 
queuing things up, now waiting for results queue to drain 28983 1726883132.92973: waiting for pending results... 28983 1726883132.93171: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 28983 1726883132.93300: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a0 28983 1726883132.93318: variable 'ansible_search_path' from source: unknown 28983 1726883132.93322: variable 'ansible_search_path' from source: unknown 28983 1726883132.93354: calling self._execute() 28983 1726883132.93438: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883132.93446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883132.93457: variable 'omit' from source: magic vars 28983 1726883132.93783: variable 'ansible_distribution_major_version' from source: facts 28983 1726883132.93793: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883132.93936: variable 'network_provider' from source: set_fact 28983 1726883132.93941: variable 'network_state' from source: role '' defaults 28983 1726883132.93951: Evaluated conditional (network_provider == "nm" or network_state != {}): True 28983 1726883132.93958: variable 'omit' from source: magic vars 28983 1726883132.94021: variable 'omit' from source: magic vars 28983 1726883132.94047: variable 'network_service_name' from source: role '' defaults 28983 1726883132.94105: variable 'network_service_name' from source: role '' defaults 28983 1726883132.94192: variable '__network_provider_setup' from source: role '' defaults 28983 1726883132.94197: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883132.94251: variable '__network_service_name_default_nm' from source: role '' defaults 28983 1726883132.94258: variable '__network_packages_default_nm' from source: role '' defaults 28983 1726883132.94314: variable '__network_packages_default_nm' from source: role '' 
defaults 28983 1726883132.94506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883132.96459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883132.96522: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883132.96554: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883132.96586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883132.96612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883132.96679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.96706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.96726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.96759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.96774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.96813: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.96837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.96858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.96891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.96904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.97095: variable '__network_packages_default_gobject_packages' from source: role '' defaults 28983 1726883132.97194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.97214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.97235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.97270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.97285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.97358: variable 'ansible_python' from source: facts 28983 1726883132.97376: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 28983 1726883132.97439: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883132.97505: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 28983 1726883132.97611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.97632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.97654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.97687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.97702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.97743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883132.97766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883132.97787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.97823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883132.97837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883132.97950: variable 'network_connections' from source: include params 28983 1726883132.97958: variable 'interface' from source: play vars 28983 1726883132.98020: variable 'interface' from source: play vars 28983 1726883132.98181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883132.98413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883132.98417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883132.98439: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883132.98627: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883132.98631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883132.98636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883132.98639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883132.98678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883132.98730: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.99100: variable 'network_connections' from source: include params 28983 1726883132.99108: variable 'interface' from source: play vars 28983 1726883132.99200: variable 'interface' from source: play vars 28983 1726883132.99238: variable '__network_packages_default_wireless' from source: role '' defaults 28983 1726883132.99341: variable '__network_wireless_connections_defined' from source: role '' defaults 28983 1726883132.99727: variable 'network_connections' from source: include params 28983 1726883132.99733: variable 'interface' from source: play vars 28983 1726883132.99818: variable 'interface' from source: play vars 28983 1726883132.99854: variable '__network_packages_default_team' from source: role '' defaults 28983 1726883132.99964: variable '__network_team_connections_defined' from source: role '' defaults 28983 1726883133.00268: variable 'network_connections' from source: include params 28983 1726883133.00275: variable 'interface' from source: play vars 28983 1726883133.00333: variable 'interface' from source: play vars 28983 1726883133.00378: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 28983 1726883133.00431: variable '__network_service_name_default_initscripts' from source: role '' defaults 28983 1726883133.00439: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883133.00491: variable '__network_packages_default_initscripts' from source: role '' defaults 28983 1726883133.00679: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 28983 1726883133.01079: variable 'network_connections' from source: include params 28983 1726883133.01082: variable 'interface' from source: play vars 28983 1726883133.01133: variable 'interface' from source: play vars 28983 1726883133.01140: variable 'ansible_distribution' from source: facts 28983 1726883133.01144: variable '__network_rh_distros' from source: role '' defaults 28983 1726883133.01151: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.01167: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 28983 1726883133.01311: variable 'ansible_distribution' from source: facts 28983 1726883133.01314: variable '__network_rh_distros' from source: role '' defaults 28983 1726883133.01321: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.01327: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 28983 1726883133.01476: variable 'ansible_distribution' from source: facts 28983 1726883133.01480: variable '__network_rh_distros' from source: role '' defaults 28983 1726883133.01485: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.01514: variable 'network_provider' from source: set_fact 28983 1726883133.01535: variable 'omit' from source: magic vars 28983 1726883133.01559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883133.01585: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883133.01602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883133.01618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883133.01628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883133.01656: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883133.01659: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.01664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.01746: Set connection var ansible_connection to ssh 28983 1726883133.01756: Set connection var ansible_shell_executable to /bin/sh 28983 1726883133.01765: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883133.01776: Set connection var ansible_timeout to 10 28983 1726883133.01782: Set connection var ansible_pipelining to False 28983 1726883133.01784: Set connection var ansible_shell_type to sh 28983 1726883133.01805: variable 'ansible_shell_executable' from source: unknown 28983 1726883133.01808: variable 'ansible_connection' from source: unknown 28983 1726883133.01813: variable 'ansible_module_compression' from source: unknown 28983 1726883133.01815: variable 'ansible_shell_type' from source: unknown 28983 1726883133.01818: variable 'ansible_shell_executable' from source: unknown 28983 1726883133.01826: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.01829: variable 'ansible_pipelining' from source: unknown 28983 1726883133.01831: variable 'ansible_timeout' from source: unknown 28983 1726883133.01837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 
1726883133.01919: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883133.01933: variable 'omit' from source: magic vars 28983 1726883133.01942: starting attempt loop 28983 1726883133.01945: running the handler 28983 1726883133.02009: variable 'ansible_facts' from source: unknown 28983 1726883133.02646: _low_level_execute_command(): starting 28983 1726883133.02654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883133.03210: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.03215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.03219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.03234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883133.03256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 28983 1726883133.03341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.05112: stdout chunk (state=3): >>>/root <<< 28983 1726883133.05223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.05275: stderr chunk (state=3): >>><<< 28983 1726883133.05283: stdout chunk (state=3): >>><<< 28983 1726883133.05302: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883133.05313: _low_level_execute_command(): starting 28983 1726883133.05319: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603 `" && echo ansible-tmp-1726883133.0530283-34697-273490632447603="` echo 
/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603 `" ) && sleep 0' 28983 1726883133.05774: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883133.05778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883133.05780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883133.05783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883133.05785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.05840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883133.05852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.05917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.07953: stdout chunk (state=3): >>>ansible-tmp-1726883133.0530283-34697-273490632447603=/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603 <<< 28983 1726883133.08069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.08117: stderr chunk (state=3): >>><<< 28983 1726883133.08120: stdout chunk 
(state=3): >>><<< 28983 1726883133.08137: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883133.0530283-34697-273490632447603=/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883133.08164: variable 'ansible_module_compression' from source: unknown 28983 1726883133.08205: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 28983 1726883133.08250: variable 'ansible_facts' from source: unknown 28983 1726883133.08392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py 28983 1726883133.08510: Sending initial data 28983 1726883133.08514: Sent initial data (156 bytes) 28983 1726883133.08939: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.08983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883133.08987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.08992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883133.08995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883133.08997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.09040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883133.09044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.09116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.10778: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883133.10787: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883133.10848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883133.10923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp_9zdw2cl /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py <<< 28983 1726883133.10926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py" <<< 28983 1726883133.10990: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp_9zdw2cl" to remote "/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py" <<< 28983 1726883133.10992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py" <<< 28983 1726883133.12827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.12885: stderr chunk (state=3): >>><<< 28983 1726883133.12890: stdout chunk (state=3): >>><<< 28983 1726883133.12910: done transferring module to remote 28983 1726883133.12918: _low_level_execute_command(): starting 28983 1726883133.12923: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/ /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py && sleep 0' 28983 1726883133.13328: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.13364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883133.13368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.13370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.13421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883133.13428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.13496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.15383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.15427: stderr chunk (state=3): >>><<< 28983 1726883133.15430: stdout chunk (state=3): >>><<< 28983 1726883133.15443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 
originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883133.15447: _low_level_execute_command(): starting 28983 1726883133.15454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/AnsiballZ_systemd.py && sleep 0' 28983 1726883133.15882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883133.15885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883133.15888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883133.15890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883133.15892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.15940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883133.15955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.16022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.48438: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4468736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1773328000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", 
"MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": 
"-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", 
"CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 28983 1726883133.50377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883133.50442: stderr chunk (state=3): >>><<< 28983 1726883133.50445: stdout chunk (state=3): >>><<< 28983 1726883133.50462: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3307", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ExecMainStartTimestampMonotonic": "237115079", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3307", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4837", "MemoryCurrent": "4468736", "MemoryAvailable": "infinity", "CPUUsageNSec": "1773328000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target 
NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target", "After": "systemd-journald.socket dbus-broker.service dbus.socket system.slice cloud-init-local.service sysinit.target network-pre.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:40:53 EDT", "StateChangeTimestampMonotonic": "816797668", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveExitTimestampMonotonic": "237115438", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveEnterTimestampMonotonic": "237210316", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ActiveExitTimestampMonotonic": "237082173", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:13 EDT", "InactiveEnterTimestampMonotonic": "237109124", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:13 EDT", "ConditionTimestampMonotonic": "237109810", "AssertTimestamp": "Fri 2024-09-20 21:31:13 EDT", "AssertTimestampMonotonic": "237109813", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac5f6302898c463683c4bdf580ab7e3e", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883133.50655: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883133.50674: _low_level_execute_command(): starting 28983 1726883133.50678: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883133.0530283-34697-273490632447603/ > /dev/null 2>&1 && sleep 0' 28983 1726883133.51178: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883133.51182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883133.51185: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.51187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883133.51189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.51238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883133.51246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.51318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.53253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.53308: stderr chunk (state=3): >>><<< 28983 1726883133.53311: stdout chunk (state=3): >>><<< 28983 1726883133.53326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 28983 1726883133.53339: handler run complete 28983 1726883133.53389: attempt loop complete, returning result 28983 1726883133.53392: _execute() done 28983 1726883133.53398: dumping result to json 28983 1726883133.53413: done dumping result, returning 28983 1726883133.53425: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affe814-3a2d-b16d-c0a7-0000000026a0] 28983 1726883133.53427: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a0 28983 1726883133.54023: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a0 28983 1726883133.54027: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883133.54082: no more pending results, returning what we have 28983 1726883133.54085: results queue empty 28983 1726883133.54085: checking for any_errors_fatal 28983 1726883133.54089: done checking for any_errors_fatal 28983 1726883133.54089: checking for max_fail_percentage 28983 1726883133.54091: done checking for max_fail_percentage 28983 1726883133.54091: checking to see if all hosts have failed and the running result is not ok 28983 1726883133.54092: done checking to see if all hosts have failed 28983 1726883133.54093: getting the remaining hosts for this loop 28983 1726883133.54094: done getting the remaining hosts for this loop 28983 1726883133.54097: getting the next task for host managed_node2 28983 1726883133.54102: done getting next task for host managed_node2 28983 1726883133.54105: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883133.54110: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883133.54119: getting variables 28983 1726883133.54120: in VariableManager get_vars() 28983 1726883133.54151: Calling all_inventory to load vars for managed_node2 28983 1726883133.54153: Calling groups_inventory to load vars for managed_node2 28983 1726883133.54155: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883133.54162: Calling all_plugins_play to load vars for managed_node2 28983 1726883133.54164: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883133.54166: Calling groups_plugins_play to load vars for managed_node2 28983 1726883133.55327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883133.56936: done with get_vars() 28983 1726883133.56960: done getting variables 28983 1726883133.57012: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:45:33 -0400 (0:00:00.643) 0:02:43.568 ****** 28983 1726883133.57048: entering _queue_task() for managed_node2/service 28983 1726883133.57319: worker is 1 (out of 1 available) 28983 1726883133.57335: exiting _queue_task() for managed_node2/service 28983 1726883133.57350: done queuing things up, now waiting for results queue to drain 28983 1726883133.57352: waiting for pending results... 28983 1726883133.57557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 28983 1726883133.57694: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a1 28983 1726883133.57704: variable 'ansible_search_path' from source: unknown 28983 1726883133.57708: variable 'ansible_search_path' from source: unknown 28983 1726883133.57742: calling self._execute() 28983 1726883133.57830: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.57838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.57849: variable 'omit' from source: magic vars 28983 1726883133.58187: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.58198: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883133.58301: variable 'network_provider' from source: set_fact 28983 1726883133.58307: Evaluated conditional (network_provider == "nm"): True 28983 1726883133.58392: variable '__network_wpa_supplicant_required' from source: role '' defaults 28983 1726883133.58468: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 28983 1726883133.58621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883133.60495: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883133.60554: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883133.60587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883133.60616: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883133.60646: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883133.60724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883133.60753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883133.60777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883133.60810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883133.60823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883133.60870: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883133.60893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883133.60913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883133.60947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883133.60966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883133.61001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883133.61020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883133.61042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883133.61077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 
1726883133.61091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883133.61212: variable 'network_connections' from source: include params 28983 1726883133.61223: variable 'interface' from source: play vars 28983 1726883133.61282: variable 'interface' from source: play vars 28983 1726883133.61343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 28983 1726883133.61479: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 28983 1726883133.61515: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 28983 1726883133.61544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 28983 1726883133.61571: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 28983 1726883133.61612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 28983 1726883133.61633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 28983 1726883133.61655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883133.61678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 28983 1726883133.61721: variable 
'__network_wireless_connections_defined' from source: role '' defaults 28983 1726883133.61934: variable 'network_connections' from source: include params 28983 1726883133.61940: variable 'interface' from source: play vars 28983 1726883133.61994: variable 'interface' from source: play vars 28983 1726883133.62020: Evaluated conditional (__network_wpa_supplicant_required): False 28983 1726883133.62023: when evaluation is False, skipping this task 28983 1726883133.62026: _execute() done 28983 1726883133.62031: dumping result to json 28983 1726883133.62039: done dumping result, returning 28983 1726883133.62045: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affe814-3a2d-b16d-c0a7-0000000026a1] 28983 1726883133.62057: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a1 28983 1726883133.62150: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a1 28983 1726883133.62154: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 28983 1726883133.62207: no more pending results, returning what we have 28983 1726883133.62211: results queue empty 28983 1726883133.62212: checking for any_errors_fatal 28983 1726883133.62251: done checking for any_errors_fatal 28983 1726883133.62252: checking for max_fail_percentage 28983 1726883133.62254: done checking for max_fail_percentage 28983 1726883133.62255: checking to see if all hosts have failed and the running result is not ok 28983 1726883133.62256: done checking to see if all hosts have failed 28983 1726883133.62257: getting the remaining hosts for this loop 28983 1726883133.62261: done getting the remaining hosts for this loop 28983 1726883133.62266: getting the next task for host managed_node2 28983 1726883133.62275: done getting next task for host managed_node2 28983 1726883133.62280: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 28983 1726883133.62286: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883133.62315: getting variables 28983 1726883133.62316: in VariableManager get_vars() 28983 1726883133.62370: Calling all_inventory to load vars for managed_node2 28983 1726883133.62373: Calling groups_inventory to load vars for managed_node2 28983 1726883133.62376: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883133.62385: Calling all_plugins_play to load vars for managed_node2 28983 1726883133.62388: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883133.62392: Calling groups_plugins_play to load vars for managed_node2 28983 1726883133.63745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883133.65358: done with get_vars() 28983 1726883133.65383: done getting variables 28983 1726883133.65431: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:45:33 -0400 (0:00:00.084) 0:02:43.652 ****** 28983 1726883133.65460: entering _queue_task() for managed_node2/service 28983 1726883133.65713: worker is 1 (out of 1 available) 28983 1726883133.65727: exiting _queue_task() for managed_node2/service 28983 1726883133.65741: done queuing things up, now waiting for results queue to drain 28983 1726883133.65743: waiting for pending results... 
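The wpa_supplicant skip above follows from a chain of conditionals visible in the log: `ansible_distribution_major_version != '6'` is True, `network_provider == "nm"` is True, but `__network_wpa_supplicant_required` evaluates to False because no IEEE 802.1x or wireless connections are defined for the `interface` under test. A hedged sketch of the task at tasks/main.yml:133, with the `when` clauses taken from the evaluated conditionals in the log (exact YAML is an assumption):

```yaml
# Hypothetical sketch; conditions mirror the "Evaluated conditional" lines above.
- name: Enable and start wpa_supplicant
  service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required  # False here: no 802.1x/wireless profiles
```

Because the first failing condition short-circuits the task, the log records `"false_condition": "__network_wpa_supplicant_required"` and skips without contacting the host.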
28983 1726883133.65946: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 28983 1726883133.66061: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a2 28983 1726883133.66076: variable 'ansible_search_path' from source: unknown 28983 1726883133.66080: variable 'ansible_search_path' from source: unknown 28983 1726883133.66116: calling self._execute() 28983 1726883133.66205: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.66216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.66227: variable 'omit' from source: magic vars 28983 1726883133.66555: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.66566: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883133.66671: variable 'network_provider' from source: set_fact 28983 1726883133.66680: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883133.66683: when evaluation is False, skipping this task 28983 1726883133.66685: _execute() done 28983 1726883133.66690: dumping result to json 28983 1726883133.66695: done dumping result, returning 28983 1726883133.66703: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affe814-3a2d-b16d-c0a7-0000000026a2] 28983 1726883133.66709: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a2 28983 1726883133.66810: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a2 28983 1726883133.66814: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 28983 1726883133.66865: no more pending results, returning what we have 28983 1726883133.66869: results queue empty 28983 1726883133.66870: checking for any_errors_fatal 28983 1726883133.66876: done checking for 
any_errors_fatal 28983 1726883133.66877: checking for max_fail_percentage 28983 1726883133.66879: done checking for max_fail_percentage 28983 1726883133.66881: checking to see if all hosts have failed and the running result is not ok 28983 1726883133.66882: done checking to see if all hosts have failed 28983 1726883133.66883: getting the remaining hosts for this loop 28983 1726883133.66885: done getting the remaining hosts for this loop 28983 1726883133.66890: getting the next task for host managed_node2 28983 1726883133.66898: done getting next task for host managed_node2 28983 1726883133.66903: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883133.66908: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883133.66937: getting variables 28983 1726883133.66939: in VariableManager get_vars() 28983 1726883133.66979: Calling all_inventory to load vars for managed_node2 28983 1726883133.66982: Calling groups_inventory to load vars for managed_node2 28983 1726883133.66984: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883133.66993: Calling all_plugins_play to load vars for managed_node2 28983 1726883133.66997: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883133.67000: Calling groups_plugins_play to load vars for managed_node2 28983 1726883133.68220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883133.69916: done with get_vars() 28983 1726883133.69943: done getting variables 28983 1726883133.69990: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:45:33 -0400 (0:00:00.045) 0:02:43.697 ****** 28983 1726883133.70019: entering _queue_task() for managed_node2/copy 28983 1726883133.70260: worker is 1 (out of 1 available) 28983 1726883133.70275: exiting _queue_task() for managed_node2/copy 28983 1726883133.70289: done queuing things up, now waiting for results queue to drain 28983 1726883133.70291: waiting for pending results... 
28983 1726883133.70495: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 28983 1726883133.70603: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a3 28983 1726883133.70617: variable 'ansible_search_path' from source: unknown 28983 1726883133.70622: variable 'ansible_search_path' from source: unknown 28983 1726883133.70657: calling self._execute() 28983 1726883133.70743: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.70754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.70763: variable 'omit' from source: magic vars 28983 1726883133.71095: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.71106: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883133.71207: variable 'network_provider' from source: set_fact 28983 1726883133.71215: Evaluated conditional (network_provider == "initscripts"): False 28983 1726883133.71218: when evaluation is False, skipping this task 28983 1726883133.71221: _execute() done 28983 1726883133.71224: dumping result to json 28983 1726883133.71229: done dumping result, returning 28983 1726883133.71239: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affe814-3a2d-b16d-c0a7-0000000026a3] 28983 1726883133.71245: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a3 28983 1726883133.71351: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a3 28983 1726883133.71354: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 28983 1726883133.71412: no more pending results, returning what we have 28983 1726883133.71416: results queue empty 28983 1726883133.71417: checking for 
any_errors_fatal 28983 1726883133.71424: done checking for any_errors_fatal 28983 1726883133.71424: checking for max_fail_percentage 28983 1726883133.71426: done checking for max_fail_percentage 28983 1726883133.71427: checking to see if all hosts have failed and the running result is not ok 28983 1726883133.71428: done checking to see if all hosts have failed 28983 1726883133.71429: getting the remaining hosts for this loop 28983 1726883133.71431: done getting the remaining hosts for this loop 28983 1726883133.71437: getting the next task for host managed_node2 28983 1726883133.71446: done getting next task for host managed_node2 28983 1726883133.71451: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883133.71456: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883133.71491: getting variables 28983 1726883133.71493: in VariableManager get_vars() 28983 1726883133.71537: Calling all_inventory to load vars for managed_node2 28983 1726883133.71539: Calling groups_inventory to load vars for managed_node2 28983 1726883133.71541: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883133.71548: Calling all_plugins_play to load vars for managed_node2 28983 1726883133.71550: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883133.71552: Calling groups_plugins_play to load vars for managed_node2 28983 1726883133.72754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883133.74338: done with get_vars() 28983 1726883133.74360: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:45:33 -0400 (0:00:00.044) 0:02:43.742 ****** 28983 1726883133.74429: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883133.74650: worker is 1 (out of 1 available) 28983 1726883133.74664: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 28983 1726883133.74677: done queuing things up, now waiting for results queue to drain 28983 1726883133.74680: waiting for pending results... 
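Both tasks just skipped ("Enable network service" at tasks/main.yml:142 and "Ensure initscripts network file dependency is present" at :150) gate on the same condition, `network_provider == "initscripts"`, which is False because the provider resolved to `nm` earlier in the run. A hedged sketch of the gating pattern (the service name `network` and exact task bodies are assumptions):

```yaml
# Hypothetical sketch of the initscripts-only branch; skipped when provider is "nm".
- name: Enable network service
  service:
    name: network   # assumed legacy initscripts unit name
    enabled: true
  when: network_provider == "initscripts"
```

This is why, on an nm-provider host, the role proceeds directly from the wpa_supplicant check to "Configure networking connection profiles".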
28983 1726883133.74872: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 28983 1726883133.74977: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a4 28983 1726883133.74993: variable 'ansible_search_path' from source: unknown 28983 1726883133.74998: variable 'ansible_search_path' from source: unknown 28983 1726883133.75031: calling self._execute() 28983 1726883133.75115: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.75127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.75136: variable 'omit' from source: magic vars 28983 1726883133.75455: variable 'ansible_distribution_major_version' from source: facts 28983 1726883133.75461: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883133.75470: variable 'omit' from source: magic vars 28983 1726883133.75527: variable 'omit' from source: magic vars 28983 1726883133.75665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883133.77681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883133.77733: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883133.77768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883133.77801: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883133.77824: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883133.77890: variable 'network_provider' from source: set_fact 28983 1726883133.78000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883133.78023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883133.78046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883133.78079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883133.78095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883133.78157: variable 'omit' from source: magic vars 28983 1726883133.78248: variable 'omit' from source: magic vars 28983 1726883133.78337: variable 'network_connections' from source: include params 28983 1726883133.78350: variable 'interface' from source: play vars 28983 1726883133.78403: variable 'interface' from source: play vars 28983 1726883133.78528: variable 'omit' from source: magic vars 28983 1726883133.78538: variable '__lsr_ansible_managed' from source: task vars 28983 1726883133.78589: variable '__lsr_ansible_managed' from source: task vars 28983 1726883133.78734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 28983 1726883133.78932: Loaded config def from plugin (lookup/template) 28983 1726883133.78937: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 28983 1726883133.78963: File lookup term: get_ansible_managed.j2 28983 1726883133.78967: variable 
'ansible_search_path' from source: unknown 28983 1726883133.78975: evaluation_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 28983 1726883133.78987: search_path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 28983 1726883133.79002: variable 'ansible_search_path' from source: unknown 28983 1726883133.89712: variable 'ansible_managed' from source: unknown 28983 1726883133.89859: variable 'omit' from source: magic vars 28983 1726883133.89883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883133.89903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883133.89916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883133.89930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 28983 1726883133.89941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883133.89962: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883133.89965: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.89970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.90041: Set connection var ansible_connection to ssh 28983 1726883133.90053: Set connection var ansible_shell_executable to /bin/sh 28983 1726883133.90162: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883133.90167: Set connection var ansible_timeout to 10 28983 1726883133.90170: Set connection var ansible_pipelining to False 28983 1726883133.90172: Set connection var ansible_shell_type to sh 28983 1726883133.90175: variable 'ansible_shell_executable' from source: unknown 28983 1726883133.90177: variable 'ansible_connection' from source: unknown 28983 1726883133.90179: variable 'ansible_module_compression' from source: unknown 28983 1726883133.90181: variable 'ansible_shell_type' from source: unknown 28983 1726883133.90183: variable 'ansible_shell_executable' from source: unknown 28983 1726883133.90186: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883133.90189: variable 'ansible_pipelining' from source: unknown 28983 1726883133.90191: variable 'ansible_timeout' from source: unknown 28983 1726883133.90193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883133.90237: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883133.90247: variable 'omit' from 
source: magic vars 28983 1726883133.90253: starting attempt loop 28983 1726883133.90256: running the handler 28983 1726883133.90269: _low_level_execute_command(): starting 28983 1726883133.90272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883133.90786: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.90790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.90793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.90795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.90860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883133.90863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883133.90865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.90938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.92718: stdout chunk (state=3): >>>/root <<< 28983 1726883133.92829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.92884: stderr chunk 
(state=3): >>><<< 28983 1726883133.92888: stdout chunk (state=3): >>><<< 28983 1726883133.92910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883133.92920: _low_level_execute_command(): starting 28983 1726883133.92927: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093 `" && echo ansible-tmp-1726883133.9290998-34711-81758946889093="` echo /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093 `" ) && sleep 0' 28983 1726883133.93376: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.93380: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.93382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883133.93385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.93432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883133.93443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.93518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.95533: stdout chunk (state=3): >>>ansible-tmp-1726883133.9290998-34711-81758946889093=/root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093 <<< 28983 1726883133.95655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.95699: stderr chunk (state=3): >>><<< 28983 1726883133.95702: stdout chunk (state=3): >>><<< 28983 1726883133.95716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883133.9290998-34711-81758946889093=/root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883133.95755: variable 'ansible_module_compression' from source: unknown 28983 1726883133.95788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 28983 1726883133.95826: variable 'ansible_facts' from source: unknown 28983 1726883133.95920: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py 28983 1726883133.96030: Sending initial data 28983 1726883133.96033: Sent initial data (167 bytes) 28983 1726883133.96495: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883133.96498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found <<< 28983 1726883133.96501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883133.96507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883133.96554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883133.96557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883133.96631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883133.98264: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883133.98268: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883133.98331: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 28983 1726883133.98401: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpj2i4rlp0 /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py <<< 28983 1726883133.98404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py" <<< 28983 1726883133.98467: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpj2i4rlp0" to remote "/root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py" <<< 28983 1726883133.99702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883133.99762: stderr chunk (state=3): >>><<< 28983 1726883133.99766: stdout chunk (state=3): >>><<< 28983 1726883133.99786: done transferring module to remote 28983 1726883133.99795: _low_level_execute_command(): starting 28983 1726883133.99800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/ /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py && sleep 0' 28983 1726883134.00290: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.00293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.00295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.00297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.00527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883134.00539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.00640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.02546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883134.02589: stderr chunk (state=3): >>><<< 28983 1726883134.02594: stdout chunk (state=3): >>><<< 28983 1726883134.02614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883134.02617: _low_level_execute_command(): starting 28983 1726883134.02621: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/AnsiballZ_network_connections.py && sleep 0' 28983 1726883134.03031: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.03068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883134.03075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.03079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.03126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883134.03132: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.03207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.32553: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 28983 1726883134.34439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883134.34505: stderr chunk (state=3): >>><<< 28983 1726883134.34509: stdout chunk (state=3): >>><<< 28983 1726883134.34527: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883134.34564: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883134.34573: _low_level_execute_command(): starting 28983 1726883134.34581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883133.9290998-34711-81758946889093/ > /dev/null 2>&1 && sleep 0' 28983 1726883134.35066: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883134.35069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883134.35075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.35077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.35079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.35142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883134.35145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883134.35147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.35216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.37173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883134.37219: stderr chunk (state=3): >>><<< 28983 1726883134.37222: stdout chunk (state=3): >>><<< 28983 1726883134.37238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883134.37246: handler run complete 28983 1726883134.37277: attempt loop complete, returning result 28983 1726883134.37281: _execute() done 28983 1726883134.37284: dumping result to json 28983 1726883134.37287: done dumping result, returning 28983 1726883134.37296: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affe814-3a2d-b16d-c0a7-0000000026a4] 28983 1726883134.37300: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a4 28983 1726883134.37413: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a4 28983 1726883134.37416: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 28983 1726883134.37550: no more pending results, returning what we have 28983 1726883134.37554: results queue empty 28983 1726883134.37555: checking for any_errors_fatal 28983 1726883134.37562: done checking for any_errors_fatal 28983 1726883134.37563: checking for max_fail_percentage 28983 1726883134.37565: done checking for max_fail_percentage 28983 1726883134.37571: checking to see if all hosts have failed and the running result is not ok 28983 1726883134.37575: done checking to see if all hosts have failed 28983 1726883134.37575: getting the 
remaining hosts for this loop 28983 1726883134.37578: done getting the remaining hosts for this loop 28983 1726883134.37582: getting the next task for host managed_node2 28983 1726883134.37590: done getting next task for host managed_node2 28983 1726883134.37594: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883134.37599: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883134.37615: getting variables 28983 1726883134.37616: in VariableManager get_vars() 28983 1726883134.37675: Calling all_inventory to load vars for managed_node2 28983 1726883134.37678: Calling groups_inventory to load vars for managed_node2 28983 1726883134.37682: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883134.37691: Calling all_plugins_play to load vars for managed_node2 28983 1726883134.37695: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883134.37698: Calling groups_plugins_play to load vars for managed_node2 28983 1726883134.39192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883134.40785: done with get_vars() 28983 1726883134.40812: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:45:34 -0400 (0:00:00.664) 0:02:44.406 ****** 28983 1726883134.40888: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883134.41161: worker is 1 (out of 1 available) 28983 1726883134.41176: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 28983 1726883134.41191: done queuing things up, now waiting for results queue to drain 28983 1726883134.41193: waiting for pending results... 
28983 1726883134.41415: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 28983 1726883134.41542: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a5 28983 1726883134.41556: variable 'ansible_search_path' from source: unknown 28983 1726883134.41560: variable 'ansible_search_path' from source: unknown 28983 1726883134.41595: calling self._execute() 28983 1726883134.41680: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.41686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.41700: variable 'omit' from source: magic vars 28983 1726883134.42020: variable 'ansible_distribution_major_version' from source: facts 28983 1726883134.42030: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883134.42140: variable 'network_state' from source: role '' defaults 28983 1726883134.42146: Evaluated conditional (network_state != {}): False 28983 1726883134.42150: when evaluation is False, skipping this task 28983 1726883134.42153: _execute() done 28983 1726883134.42159: dumping result to json 28983 1726883134.42163: done dumping result, returning 28983 1726883134.42174: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affe814-3a2d-b16d-c0a7-0000000026a5] 28983 1726883134.42177: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a5 28983 1726883134.42278: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a5 28983 1726883134.42281: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 28983 1726883134.42352: no more pending results, returning what we have 28983 1726883134.42356: results queue empty 28983 1726883134.42357: checking for any_errors_fatal 28983 1726883134.42367: done checking for any_errors_fatal 
28983 1726883134.42368: checking for max_fail_percentage 28983 1726883134.42370: done checking for max_fail_percentage 28983 1726883134.42371: checking to see if all hosts have failed and the running result is not ok 28983 1726883134.42374: done checking to see if all hosts have failed 28983 1726883134.42375: getting the remaining hosts for this loop 28983 1726883134.42377: done getting the remaining hosts for this loop 28983 1726883134.42382: getting the next task for host managed_node2 28983 1726883134.42397: done getting next task for host managed_node2 28983 1726883134.42403: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883134.42408: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883134.42436: getting variables 28983 1726883134.42438: in VariableManager get_vars() 28983 1726883134.42482: Calling all_inventory to load vars for managed_node2 28983 1726883134.42485: Calling groups_inventory to load vars for managed_node2 28983 1726883134.42488: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883134.42496: Calling all_plugins_play to load vars for managed_node2 28983 1726883134.42505: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883134.42508: Calling groups_plugins_play to load vars for managed_node2 28983 1726883134.43728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883134.49877: done with get_vars() 28983 1726883134.49900: done getting variables 28983 1726883134.49944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:45:34 -0400 (0:00:00.090) 0:02:44.497 ****** 28983 1726883134.49968: entering _queue_task() for managed_node2/debug 28983 1726883134.50240: worker is 1 (out of 1 available) 28983 1726883134.50255: exiting _queue_task() for managed_node2/debug 28983 1726883134.50267: done queuing things up, now waiting for results queue to drain 28983 1726883134.50269: waiting for pending results... 
28983 1726883134.50475: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 28983 1726883134.50618: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a6 28983 1726883134.50634: variable 'ansible_search_path' from source: unknown 28983 1726883134.50638: variable 'ansible_search_path' from source: unknown 28983 1726883134.50676: calling self._execute() 28983 1726883134.50769: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.50778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.50788: variable 'omit' from source: magic vars 28983 1726883134.51125: variable 'ansible_distribution_major_version' from source: facts 28983 1726883134.51137: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883134.51144: variable 'omit' from source: magic vars 28983 1726883134.51206: variable 'omit' from source: magic vars 28983 1726883134.51238: variable 'omit' from source: magic vars 28983 1726883134.51281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883134.51313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883134.51335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883134.51351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883134.51362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883134.51394: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883134.51398: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.51403: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 28983 1726883134.51484: Set connection var ansible_connection to ssh 28983 1726883134.51493: Set connection var ansible_shell_executable to /bin/sh 28983 1726883134.51502: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883134.51511: Set connection var ansible_timeout to 10 28983 1726883134.51518: Set connection var ansible_pipelining to False 28983 1726883134.51521: Set connection var ansible_shell_type to sh 28983 1726883134.51544: variable 'ansible_shell_executable' from source: unknown 28983 1726883134.51547: variable 'ansible_connection' from source: unknown 28983 1726883134.51550: variable 'ansible_module_compression' from source: unknown 28983 1726883134.51554: variable 'ansible_shell_type' from source: unknown 28983 1726883134.51558: variable 'ansible_shell_executable' from source: unknown 28983 1726883134.51561: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.51567: variable 'ansible_pipelining' from source: unknown 28983 1726883134.51569: variable 'ansible_timeout' from source: unknown 28983 1726883134.51594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.51703: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883134.51709: variable 'omit' from source: magic vars 28983 1726883134.51716: starting attempt loop 28983 1726883134.51718: running the handler 28983 1726883134.51836: variable '__network_connections_result' from source: set_fact 28983 1726883134.51882: handler run complete 28983 1726883134.51898: attempt loop complete, returning result 28983 1726883134.51901: _execute() done 28983 1726883134.51903: dumping result to json 28983 1726883134.51909: 
done dumping result, returning 28983 1726883134.51920: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000026a6] 28983 1726883134.51924: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a6 28983 1726883134.52039: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a6 28983 1726883134.52042: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 28983 1726883134.52129: no more pending results, returning what we have 28983 1726883134.52133: results queue empty 28983 1726883134.52139: checking for any_errors_fatal 28983 1726883134.52147: done checking for any_errors_fatal 28983 1726883134.52148: checking for max_fail_percentage 28983 1726883134.52150: done checking for max_fail_percentage 28983 1726883134.52154: checking to see if all hosts have failed and the running result is not ok 28983 1726883134.52155: done checking to see if all hosts have failed 28983 1726883134.52156: getting the remaining hosts for this loop 28983 1726883134.52158: done getting the remaining hosts for this loop 28983 1726883134.52162: getting the next task for host managed_node2 28983 1726883134.52170: done getting next task for host managed_node2 28983 1726883134.52176: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883134.52182: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883134.52196: getting variables 28983 1726883134.52197: in VariableManager get_vars() 28983 1726883134.52238: Calling all_inventory to load vars for managed_node2 28983 1726883134.52241: Calling groups_inventory to load vars for managed_node2 28983 1726883134.52244: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883134.52256: Calling all_plugins_play to load vars for managed_node2 28983 1726883134.52263: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883134.52266: Calling groups_plugins_play to load vars for managed_node2 28983 1726883134.53503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883134.55119: done with get_vars() 28983 1726883134.55143: done getting variables 28983 1726883134.55191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:45:34 -0400 (0:00:00.052) 0:02:44.550 ****** 28983 1726883134.55225: entering _queue_task() for managed_node2/debug 28983 1726883134.55462: worker is 1 (out of 1 available) 28983 1726883134.55480: exiting _queue_task() for managed_node2/debug 28983 1726883134.55493: done queuing things up, now waiting for results queue to drain 28983 1726883134.55495: waiting for pending results... 28983 1726883134.55678: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 28983 1726883134.55806: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a7 28983 1726883134.55818: variable 'ansible_search_path' from source: unknown 28983 1726883134.55822: variable 'ansible_search_path' from source: unknown 28983 1726883134.55856: calling self._execute() 28983 1726883134.55936: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.55949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.55956: variable 'omit' from source: magic vars 28983 1726883134.56270: variable 'ansible_distribution_major_version' from source: facts 28983 1726883134.56287: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883134.56291: variable 'omit' from source: magic vars 28983 1726883134.56342: variable 'omit' from source: magic vars 28983 1726883134.56370: variable 'omit' from source: magic vars 28983 1726883134.56407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883134.56439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883134.56457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883134.56476: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883134.56486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883134.56517: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883134.56520: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.56524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.56605: Set connection var ansible_connection to ssh 28983 1726883134.56617: Set connection var ansible_shell_executable to /bin/sh 28983 1726883134.56625: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883134.56636: Set connection var ansible_timeout to 10 28983 1726883134.56643: Set connection var ansible_pipelining to False 28983 1726883134.56646: Set connection var ansible_shell_type to sh 28983 1726883134.56664: variable 'ansible_shell_executable' from source: unknown 28983 1726883134.56667: variable 'ansible_connection' from source: unknown 28983 1726883134.56671: variable 'ansible_module_compression' from source: unknown 28983 1726883134.56676: variable 'ansible_shell_type' from source: unknown 28983 1726883134.56678: variable 'ansible_shell_executable' from source: unknown 28983 1726883134.56681: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.56687: variable 'ansible_pipelining' from source: unknown 28983 1726883134.56690: variable 'ansible_timeout' from source: unknown 28983 1726883134.56695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.56810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883134.56820: variable 'omit' from source: magic vars 28983 1726883134.56832: starting attempt loop 28983 1726883134.56837: running the handler 28983 1726883134.56878: variable '__network_connections_result' from source: set_fact 28983 1726883134.56945: variable '__network_connections_result' from source: set_fact 28983 1726883134.57039: handler run complete 28983 1726883134.57064: attempt loop complete, returning result 28983 1726883134.57068: _execute() done 28983 1726883134.57071: dumping result to json 28983 1726883134.57076: done dumping result, returning 28983 1726883134.57085: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affe814-3a2d-b16d-c0a7-0000000026a7] 28983 1726883134.57090: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a7 28983 1726883134.57193: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a7 28983 1726883134.57196: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 28983 1726883134.57302: no more pending results, returning what we have 28983 1726883134.57306: results queue empty 28983 1726883134.57307: checking for any_errors_fatal 28983 1726883134.57311: 
done checking for any_errors_fatal 28983 1726883134.57312: checking for max_fail_percentage 28983 1726883134.57314: done checking for max_fail_percentage 28983 1726883134.57315: checking to see if all hosts have failed and the running result is not ok 28983 1726883134.57316: done checking to see if all hosts have failed 28983 1726883134.57317: getting the remaining hosts for this loop 28983 1726883134.57318: done getting the remaining hosts for this loop 28983 1726883134.57322: getting the next task for host managed_node2 28983 1726883134.57329: done getting next task for host managed_node2 28983 1726883134.57333: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883134.57346: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883134.57360: getting variables 28983 1726883134.57361: in VariableManager get_vars() 28983 1726883134.57398: Calling all_inventory to load vars for managed_node2 28983 1726883134.57400: Calling groups_inventory to load vars for managed_node2 28983 1726883134.57401: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883134.57411: Calling all_plugins_play to load vars for managed_node2 28983 1726883134.57414: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883134.57416: Calling groups_plugins_play to load vars for managed_node2 28983 1726883134.58796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883134.60396: done with get_vars() 28983 1726883134.60417: done getting variables 28983 1726883134.60463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:45:34 -0400 (0:00:00.052) 0:02:44.602 ****** 28983 1726883134.60494: entering _queue_task() for managed_node2/debug 28983 1726883134.60716: worker is 1 (out of 1 available) 28983 1726883134.60730: exiting _queue_task() for managed_node2/debug 28983 1726883134.60744: done queuing things up, now waiting for results queue to drain 28983 1726883134.60746: waiting for pending results... 
28983 1726883134.60931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 28983 1726883134.61062: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a8 28983 1726883134.61084: variable 'ansible_search_path' from source: unknown 28983 1726883134.61090: variable 'ansible_search_path' from source: unknown 28983 1726883134.61115: calling self._execute() 28983 1726883134.61208: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.61216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.61227: variable 'omit' from source: magic vars 28983 1726883134.61560: variable 'ansible_distribution_major_version' from source: facts 28983 1726883134.61571: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883134.61692: variable 'network_state' from source: role '' defaults 28983 1726883134.61701: Evaluated conditional (network_state != {}): False 28983 1726883134.61704: when evaluation is False, skipping this task 28983 1726883134.61707: _execute() done 28983 1726883134.61712: dumping result to json 28983 1726883134.61717: done dumping result, returning 28983 1726883134.61725: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affe814-3a2d-b16d-c0a7-0000000026a8] 28983 1726883134.61732: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a8 28983 1726883134.61828: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a8 28983 1726883134.61831: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 28983 1726883134.61898: no more pending results, returning what we have 28983 1726883134.61902: results queue empty 28983 1726883134.61903: checking for any_errors_fatal 28983 1726883134.61911: done checking for any_errors_fatal 28983 1726883134.61912: checking for 
max_fail_percentage 28983 1726883134.61914: done checking for max_fail_percentage 28983 1726883134.61915: checking to see if all hosts have failed and the running result is not ok 28983 1726883134.61916: done checking to see if all hosts have failed 28983 1726883134.61917: getting the remaining hosts for this loop 28983 1726883134.61918: done getting the remaining hosts for this loop 28983 1726883134.61922: getting the next task for host managed_node2 28983 1726883134.61930: done getting next task for host managed_node2 28983 1726883134.61936: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883134.61942: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883134.61969: getting variables 28983 1726883134.61971: in VariableManager get_vars() 28983 1726883134.62014: Calling all_inventory to load vars for managed_node2 28983 1726883134.62016: Calling groups_inventory to load vars for managed_node2 28983 1726883134.62018: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883134.62024: Calling all_plugins_play to load vars for managed_node2 28983 1726883134.62026: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883134.62029: Calling groups_plugins_play to load vars for managed_node2 28983 1726883134.63221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883134.64929: done with get_vars() 28983 1726883134.64953: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:45:34 -0400 (0:00:00.045) 0:02:44.648 ****** 28983 1726883134.65026: entering _queue_task() for managed_node2/ping 28983 1726883134.65249: worker is 1 (out of 1 available) 28983 1726883134.65263: exiting _queue_task() for managed_node2/ping 28983 1726883134.65275: done queuing things up, now waiting for results queue to drain 28983 1726883134.65277: waiting for pending results... 
28983 1726883134.65469: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 28983 1726883134.65595: in run() - task 0affe814-3a2d-b16d-c0a7-0000000026a9 28983 1726883134.65609: variable 'ansible_search_path' from source: unknown 28983 1726883134.65613: variable 'ansible_search_path' from source: unknown 28983 1726883134.65646: calling self._execute() 28983 1726883134.65727: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.65731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.65747: variable 'omit' from source: magic vars 28983 1726883134.66063: variable 'ansible_distribution_major_version' from source: facts 28983 1726883134.66077: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883134.66086: variable 'omit' from source: magic vars 28983 1726883134.66141: variable 'omit' from source: magic vars 28983 1726883134.66171: variable 'omit' from source: magic vars 28983 1726883134.66212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883134.66244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883134.66262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883134.66280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883134.66294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883134.66319: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883134.66323: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.66329: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883134.66412: Set connection var ansible_connection to ssh 28983 1726883134.66423: Set connection var ansible_shell_executable to /bin/sh 28983 1726883134.66432: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883134.66442: Set connection var ansible_timeout to 10 28983 1726883134.66449: Set connection var ansible_pipelining to False 28983 1726883134.66452: Set connection var ansible_shell_type to sh 28983 1726883134.66472: variable 'ansible_shell_executable' from source: unknown 28983 1726883134.66478: variable 'ansible_connection' from source: unknown 28983 1726883134.66482: variable 'ansible_module_compression' from source: unknown 28983 1726883134.66485: variable 'ansible_shell_type' from source: unknown 28983 1726883134.66490: variable 'ansible_shell_executable' from source: unknown 28983 1726883134.66493: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883134.66498: variable 'ansible_pipelining' from source: unknown 28983 1726883134.66503: variable 'ansible_timeout' from source: unknown 28983 1726883134.66513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883134.66681: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883134.66693: variable 'omit' from source: magic vars 28983 1726883134.66699: starting attempt loop 28983 1726883134.66701: running the handler 28983 1726883134.66716: _low_level_execute_command(): starting 28983 1726883134.66732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883134.67276: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 
1726883134.67280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.67285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883134.67287: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.67341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883134.67344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883134.67350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.67425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.69208: stdout chunk (state=3): >>>/root <<< 28983 1726883134.69318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883134.69369: stderr chunk (state=3): >>><<< 28983 1726883134.69376: stdout chunk (state=3): >>><<< 28983 1726883134.69394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883134.69407: _low_level_execute_command(): starting 28983 1726883134.69413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236 `" && echo ansible-tmp-1726883134.6939428-34726-175689350156236="` echo /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236 `" ) && sleep 0' 28983 1726883134.69865: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.69869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.69871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 
1726883134.69880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883134.69885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.69931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883134.69936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.70014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.72043: stdout chunk (state=3): >>>ansible-tmp-1726883134.6939428-34726-175689350156236=/root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236 <<< 28983 1726883134.72166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883134.72209: stderr chunk (state=3): >>><<< 28983 1726883134.72212: stdout chunk (state=3): >>><<< 28983 1726883134.72226: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883134.6939428-34726-175689350156236=/root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883134.72263: variable 'ansible_module_compression' from source: unknown 28983 1726883134.72302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 28983 1726883134.72331: variable 'ansible_facts' from source: unknown 28983 1726883134.72397: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py 28983 1726883134.72500: Sending initial data 28983 1726883134.72503: Sent initial data (153 bytes) 28983 1726883134.72953: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.72958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.72961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883134.72963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.73018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883134.73021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.73097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.74777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883134.74786: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883134.74847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883134.74921: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpj0oilgar /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py <<< 28983 1726883134.74925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py" <<< 28983 1726883134.74985: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpj0oilgar" to remote "/root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py" <<< 28983 1726883134.75862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883134.75917: stderr chunk (state=3): >>><<< 28983 1726883134.75921: stdout chunk (state=3): >>><<< 28983 1726883134.75939: done transferring module to remote 28983 1726883134.75948: _low_level_execute_command(): starting 28983 1726883134.75953: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/ /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py && sleep 0' 28983 1726883134.76385: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883134.76388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883134.76391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883134.76393: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883134.76396: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.76456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883134.76462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.76524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.78459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883134.78504: stderr chunk (state=3): >>><<< 28983 1726883134.78508: stdout chunk (state=3): >>><<< 28983 1726883134.78521: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883134.78524: _low_level_execute_command(): starting 28983 1726883134.78530: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/AnsiballZ_ping.py && sleep 0' 28983 1726883134.78945: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.78948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.78951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883134.78953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.79007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883134.79014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 28983 1726883134.79091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883134.96391: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 28983 1726883134.97812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883134.97866: stderr chunk (state=3): >>><<< 28983 1726883134.97869: stdout chunk (state=3): >>><<< 28983 1726883134.97887: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
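The module run above ends with the remote payload printing `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout, which the controller captures via `_low_level_execute_command()`. As a minimal sketch (not the real `ansible.modules.ping` source), a ping-style module just echoes its `data` argument back in that same result shape:

```python
import json

def ping_module(module_args):
    """Hedged stand-in for a ping-style module: it mirrors only the result
    shape seen in the log above (a "ping" key plus an "invocation" record
    of the module args), not the real ansible.modules.ping implementation."""
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# AnsiballZ wrappers ultimately emit the result as JSON on stdout,
# which is what shows up as the stdout chunk in the log.
print(json.dumps(ping_module({"data": "pong"})))
```

The controller then parses that JSON out of stdout and reports `ok: [managed_node2] => {"changed": false, "ping": "pong"}`, as seen further down in the log.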
28983 1726883134.97909: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883134.97918: _low_level_execute_command(): starting 28983 1726883134.97923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883134.6939428-34726-175689350156236/ > /dev/null 2>&1 && sleep 0' 28983 1726883134.98378: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883134.98382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883134.98384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883134.98387: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 
10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883134.98444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883134.98447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883134.98523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.00493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.00539: stderr chunk (state=3): >>><<< 28983 1726883135.00546: stdout chunk (state=3): >>><<< 28983 1726883135.00557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.00566: handler run complete 28983 1726883135.00583: attempt loop complete, returning 
result 28983 1726883135.00586: _execute() done 28983 1726883135.00588: dumping result to json 28983 1726883135.00594: done dumping result, returning 28983 1726883135.00602: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affe814-3a2d-b16d-c0a7-0000000026a9] 28983 1726883135.00607: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a9 28983 1726883135.00701: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000026a9 28983 1726883135.00704: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 28983 1726883135.00793: no more pending results, returning what we have 28983 1726883135.00798: results queue empty 28983 1726883135.00799: checking for any_errors_fatal 28983 1726883135.00807: done checking for any_errors_fatal 28983 1726883135.00808: checking for max_fail_percentage 28983 1726883135.00810: done checking for max_fail_percentage 28983 1726883135.00812: checking to see if all hosts have failed and the running result is not ok 28983 1726883135.00813: done checking to see if all hosts have failed 28983 1726883135.00821: getting the remaining hosts for this loop 28983 1726883135.00823: done getting the remaining hosts for this loop 28983 1726883135.00828: getting the next task for host managed_node2 28983 1726883135.00842: done getting next task for host managed_node2 28983 1726883135.00844: ^ task is: TASK: meta (role_complete) 28983 1726883135.00850: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883135.00866: getting variables 28983 1726883135.00867: in VariableManager get_vars() 28983 1726883135.00918: Calling all_inventory to load vars for managed_node2 28983 1726883135.00922: Calling groups_inventory to load vars for managed_node2 28983 1726883135.00931: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.00942: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.00945: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.00948: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.02879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.04589: done with get_vars() 28983 1726883135.04616: done getting variables 28983 1726883135.04688: done queuing things up, now waiting for results queue to drain 28983 1726883135.04690: results queue empty 28983 1726883135.04691: checking for any_errors_fatal 28983 1726883135.04693: done checking for any_errors_fatal 28983 1726883135.04693: checking for max_fail_percentage 28983 1726883135.04694: done checking for max_fail_percentage 28983 1726883135.04695: checking to see if all 
hosts have failed and the running result is not ok 28983 1726883135.04695: done checking to see if all hosts have failed 28983 1726883135.04696: getting the remaining hosts for this loop 28983 1726883135.04697: done getting the remaining hosts for this loop 28983 1726883135.04698: getting the next task for host managed_node2 28983 1726883135.04703: done getting next task for host managed_node2 28983 1726883135.04705: ^ task is: TASK: Asserts 28983 1726883135.04707: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883135.04709: getting variables 28983 1726883135.04710: in VariableManager get_vars() 28983 1726883135.04720: Calling all_inventory to load vars for managed_node2 28983 1726883135.04722: Calling groups_inventory to load vars for managed_node2 28983 1726883135.04724: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.04727: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.04729: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.04731: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.05914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.07504: done with get_vars() 28983 1726883135.07524: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:45:35 -0400 (0:00:00.425) 0:02:45.073 ****** 28983 1726883135.07587: entering _queue_task() for managed_node2/include_tasks 28983 1726883135.07907: worker is 1 (out of 1 available) 28983 1726883135.07923: exiting _queue_task() for managed_node2/include_tasks 28983 1726883135.07939: done queuing things up, now waiting for results queue to drain 28983 1726883135.07941: waiting for pending results... 
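Before each module run in this log, Ansible creates a per-task remote tmp directory with a `/bin/sh -c '( umask 77 && mkdir -p ... )'` one-liner, naming it `ansible-tmp-<epoch>-<pid>-<random>`. A hedged local reproduction of that pattern (the base path here is a local temp dir, an assumption; the real command runs over SSH):

```python
import os
import random
import subprocess
import tempfile
import time

# Build a directory name in the same style as the log's
# "ansible-tmp-1726883134.6939428-34726-175689350156236".
base = tempfile.mkdtemp()
name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**15)}"

# Same shell one-liner shape as in the log: restrictive umask, create the
# base path, create the task dir, then echo name=path so the controller
# can read the resolved directory back from stdout.
cmd = (
    f'( umask 77 && mkdir -p "` echo {base} `" && '
    f'mkdir "` echo {base}/{name} `" && '
    f'echo {name}="` echo {base}/{name} `" ) && sleep 0'
)
out = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
print(out.stdout.strip())
```

The `echo name=path` trick is why the log's stdout chunk reads `ansible-tmp-...=/root/.ansible/tmp/ansible-tmp-...`: the controller splits on `=` to learn the directory it should upload `AnsiballZ_ping.py` into and later `rm -f -r`.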
28983 1726883135.08135: running TaskExecutor() for managed_node2/TASK: Asserts 28983 1726883135.08237: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020b2 28983 1726883135.08251: variable 'ansible_search_path' from source: unknown 28983 1726883135.08256: variable 'ansible_search_path' from source: unknown 28983 1726883135.08300: variable 'lsr_assert' from source: include params 28983 1726883135.08492: variable 'lsr_assert' from source: include params 28983 1726883135.08552: variable 'omit' from source: magic vars 28983 1726883135.08666: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.08678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.08690: variable 'omit' from source: magic vars 28983 1726883135.08903: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.08911: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.08918: variable 'item' from source: unknown 28983 1726883135.08980: variable 'item' from source: unknown 28983 1726883135.09006: variable 'item' from source: unknown 28983 1726883135.09063: variable 'item' from source: unknown 28983 1726883135.09204: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.09208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.09211: variable 'omit' from source: magic vars 28983 1726883135.09333: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.09338: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.09341: variable 'item' from source: unknown 28983 1726883135.09393: variable 'item' from source: unknown 28983 1726883135.09417: variable 'item' from source: unknown 28983 1726883135.09473: variable 'item' from source: unknown 28983 1726883135.09544: dumping result to json 28983 1726883135.09550: done dumping result, returning 28983 
1726883135.09553: done running TaskExecutor() for managed_node2/TASK: Asserts [0affe814-3a2d-b16d-c0a7-0000000020b2] 28983 1726883135.09556: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b2 28983 1726883135.09595: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b2 28983 1726883135.09598: WORKER PROCESS EXITING 28983 1726883135.09631: no more pending results, returning what we have 28983 1726883135.09639: in VariableManager get_vars() 28983 1726883135.09696: Calling all_inventory to load vars for managed_node2 28983 1726883135.09699: Calling groups_inventory to load vars for managed_node2 28983 1726883135.09703: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.09717: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.09721: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.09724: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.10978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.12648: done with get_vars() 28983 1726883135.12667: variable 'ansible_search_path' from source: unknown 28983 1726883135.12668: variable 'ansible_search_path' from source: unknown 28983 1726883135.12701: variable 'ansible_search_path' from source: unknown 28983 1726883135.12702: variable 'ansible_search_path' from source: unknown 28983 1726883135.12723: we have included files to process 28983 1726883135.12724: generating all_blocks data 28983 1726883135.12726: done generating all_blocks data 28983 1726883135.12730: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883135.12731: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883135.12732: Loading data from 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 28983 1726883135.12819: in VariableManager get_vars() 28983 1726883135.12837: done with get_vars() 28983 1726883135.12926: done processing included file 28983 1726883135.12928: iterating over new_blocks loaded from include file 28983 1726883135.12929: in VariableManager get_vars() 28983 1726883135.12943: done with get_vars() 28983 1726883135.12945: filtering new block on tags 28983 1726883135.12972: done filtering new block on tags 28983 1726883135.12974: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=tasks/assert_profile_absent.yml) 28983 1726883135.12978: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 28983 1726883135.12979: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 28983 1726883135.12981: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 28983 1726883135.13282: done processing included file 28983 1726883135.13284: iterating over new_blocks loaded from include file 28983 1726883135.13285: in VariableManager get_vars() 28983 1726883135.13297: done with get_vars() 28983 1726883135.13298: filtering new block on tags 28983 1726883135.13337: done filtering new block on tags 28983 1726883135.13339: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed_node2 => (item=tasks/get_NetworkManager_NVR.yml) 28983 1726883135.13342: extending task lists 
for all hosts with included blocks 28983 1726883135.14226: done extending task lists 28983 1726883135.14228: done processing included files 28983 1726883135.14228: results queue empty 28983 1726883135.14229: checking for any_errors_fatal 28983 1726883135.14230: done checking for any_errors_fatal 28983 1726883135.14231: checking for max_fail_percentage 28983 1726883135.14231: done checking for max_fail_percentage 28983 1726883135.14232: checking to see if all hosts have failed and the running result is not ok 28983 1726883135.14233: done checking to see if all hosts have failed 28983 1726883135.14233: getting the remaining hosts for this loop 28983 1726883135.14236: done getting the remaining hosts for this loop 28983 1726883135.14238: getting the next task for host managed_node2 28983 1726883135.14241: done getting next task for host managed_node2 28983 1726883135.14243: ^ task is: TASK: Include the task 'get_profile_stat.yml' 28983 1726883135.14245: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883135.14247: getting variables 28983 1726883135.14251: in VariableManager get_vars() 28983 1726883135.14260: Calling all_inventory to load vars for managed_node2 28983 1726883135.14262: Calling groups_inventory to load vars for managed_node2 28983 1726883135.14264: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.14267: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.14269: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.14271: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.15330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.16899: done with get_vars() 28983 1726883135.16923: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:45:35 -0400 (0:00:00.093) 0:02:45.167 ****** 28983 1726883135.16980: entering _queue_task() for managed_node2/include_tasks 28983 1726883135.17213: worker is 1 (out of 1 available) 28983 1726883135.17227: exiting _queue_task() for managed_node2/include_tasks 28983 1726883135.17242: done queuing things up, now waiting for results queue to drain 28983 1726883135.17244: waiting for pending results... 
28983 1726883135.17433: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 28983 1726883135.17535: in run() - task 0affe814-3a2d-b16d-c0a7-000000002804 28983 1726883135.17550: variable 'ansible_search_path' from source: unknown 28983 1726883135.17555: variable 'ansible_search_path' from source: unknown 28983 1726883135.17590: calling self._execute() 28983 1726883135.17675: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.17679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.17707: variable 'omit' from source: magic vars 28983 1726883135.18018: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.18039: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.18045: _execute() done 28983 1726883135.18048: dumping result to json 28983 1726883135.18054: done dumping result, returning 28983 1726883135.18060: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affe814-3a2d-b16d-c0a7-000000002804] 28983 1726883135.18065: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002804 28983 1726883135.18159: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002804 28983 1726883135.18162: WORKER PROCESS EXITING 28983 1726883135.18195: no more pending results, returning what we have 28983 1726883135.18200: in VariableManager get_vars() 28983 1726883135.18258: Calling all_inventory to load vars for managed_node2 28983 1726883135.18262: Calling groups_inventory to load vars for managed_node2 28983 1726883135.18266: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.18279: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.18283: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.18287: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883135.19588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.21175: done with get_vars() 28983 1726883135.21194: variable 'ansible_search_path' from source: unknown 28983 1726883135.21195: variable 'ansible_search_path' from source: unknown 28983 1726883135.21201: variable 'item' from source: include params 28983 1726883135.21290: variable 'item' from source: include params 28983 1726883135.21316: we have included files to process 28983 1726883135.21317: generating all_blocks data 28983 1726883135.21318: done generating all_blocks data 28983 1726883135.21319: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883135.21320: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883135.21322: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 28983 1726883135.22053: done processing included file 28983 1726883135.22054: iterating over new_blocks loaded from include file 28983 1726883135.22055: in VariableManager get_vars() 28983 1726883135.22069: done with get_vars() 28983 1726883135.22071: filtering new block on tags 28983 1726883135.22125: done filtering new block on tags 28983 1726883135.22127: in VariableManager get_vars() 28983 1726883135.22142: done with get_vars() 28983 1726883135.22144: filtering new block on tags 28983 1726883135.22191: done filtering new block on tags 28983 1726883135.22193: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 28983 1726883135.22197: extending task lists for all hosts with included blocks 28983 1726883135.22381: done 
extending task lists 28983 1726883135.22382: done processing included files 28983 1726883135.22382: results queue empty 28983 1726883135.22383: checking for any_errors_fatal 28983 1726883135.22386: done checking for any_errors_fatal 28983 1726883135.22386: checking for max_fail_percentage 28983 1726883135.22387: done checking for max_fail_percentage 28983 1726883135.22388: checking to see if all hosts have failed and the running result is not ok 28983 1726883135.22388: done checking to see if all hosts have failed 28983 1726883135.22389: getting the remaining hosts for this loop 28983 1726883135.22390: done getting the remaining hosts for this loop 28983 1726883135.22392: getting the next task for host managed_node2 28983 1726883135.22395: done getting next task for host managed_node2 28983 1726883135.22397: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883135.22399: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 28983 1726883135.22401: getting variables 28983 1726883135.22402: in VariableManager get_vars() 28983 1726883135.22410: Calling all_inventory to load vars for managed_node2 28983 1726883135.22412: Calling groups_inventory to load vars for managed_node2 28983 1726883135.22413: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.22417: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.22419: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.22421: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.23513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.25072: done with get_vars() 28983 1726883135.25095: done getting variables 28983 1726883135.25125: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:45:35 -0400 (0:00:00.081) 0:02:45.249 ****** 28983 1726883135.25151: entering _queue_task() for managed_node2/set_fact 28983 1726883135.25386: worker is 1 (out of 1 available) 28983 1726883135.25399: exiting _queue_task() for managed_node2/set_fact 28983 1726883135.25413: done queuing things up, now waiting for results queue to drain 28983 1726883135.25415: waiting for pending results... 
28983 1726883135.25612: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 28983 1726883135.25714: in run() - task 0affe814-3a2d-b16d-c0a7-000000002888 28983 1726883135.25727: variable 'ansible_search_path' from source: unknown 28983 1726883135.25731: variable 'ansible_search_path' from source: unknown 28983 1726883135.25766: calling self._execute() 28983 1726883135.25849: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.25856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.25872: variable 'omit' from source: magic vars 28983 1726883135.26187: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.26198: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.26205: variable 'omit' from source: magic vars 28983 1726883135.26252: variable 'omit' from source: magic vars 28983 1726883135.26283: variable 'omit' from source: magic vars 28983 1726883135.26322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883135.26356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883135.26375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883135.26393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883135.26403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883135.26436: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883135.26439: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.26444: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 28983 1726883135.26524: Set connection var ansible_connection to ssh 28983 1726883135.26539: Set connection var ansible_shell_executable to /bin/sh 28983 1726883135.26549: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883135.26557: Set connection var ansible_timeout to 10 28983 1726883135.26564: Set connection var ansible_pipelining to False 28983 1726883135.26567: Set connection var ansible_shell_type to sh 28983 1726883135.26588: variable 'ansible_shell_executable' from source: unknown 28983 1726883135.26592: variable 'ansible_connection' from source: unknown 28983 1726883135.26595: variable 'ansible_module_compression' from source: unknown 28983 1726883135.26597: variable 'ansible_shell_type' from source: unknown 28983 1726883135.26601: variable 'ansible_shell_executable' from source: unknown 28983 1726883135.26605: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.26610: variable 'ansible_pipelining' from source: unknown 28983 1726883135.26613: variable 'ansible_timeout' from source: unknown 28983 1726883135.26618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.26736: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883135.26753: variable 'omit' from source: magic vars 28983 1726883135.26759: starting attempt loop 28983 1726883135.26762: running the handler 28983 1726883135.26778: handler run complete 28983 1726883135.26785: attempt loop complete, returning result 28983 1726883135.26788: _execute() done 28983 1726883135.26792: dumping result to json 28983 1726883135.26797: done dumping result, returning 28983 1726883135.26804: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affe814-3a2d-b16d-c0a7-000000002888] 28983 1726883135.26810: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002888 28983 1726883135.26901: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002888 28983 1726883135.26904: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 28983 1726883135.26967: no more pending results, returning what we have 28983 1726883135.26970: results queue empty 28983 1726883135.26971: checking for any_errors_fatal 28983 1726883135.26976: done checking for any_errors_fatal 28983 1726883135.26977: checking for max_fail_percentage 28983 1726883135.26979: done checking for max_fail_percentage 28983 1726883135.26980: checking to see if all hosts have failed and the running result is not ok 28983 1726883135.26981: done checking to see if all hosts have failed 28983 1726883135.26982: getting the remaining hosts for this loop 28983 1726883135.26983: done getting the remaining hosts for this loop 28983 1726883135.26988: getting the next task for host managed_node2 28983 1726883135.26997: done getting next task for host managed_node2 28983 1726883135.27000: ^ task is: TASK: Stat profile file 28983 1726883135.27006: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883135.27010: getting variables 28983 1726883135.27011: in VariableManager get_vars() 28983 1726883135.27050: Calling all_inventory to load vars for managed_node2 28983 1726883135.27053: Calling groups_inventory to load vars for managed_node2 28983 1726883135.27056: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.27065: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.27068: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.27074: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.28305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.29918: done with get_vars() 28983 1726883135.29942: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:45:35 -0400 (0:00:00.048) 0:02:45.297 ****** 28983 1726883135.30014: entering _queue_task() for managed_node2/stat 28983 1726883135.30236: worker is 1 (out of 1 available) 28983 1726883135.30251: exiting _queue_task() for managed_node2/stat 28983 1726883135.30267: done queuing things up, now waiting for results queue to drain 28983 1726883135.30269: 
waiting for pending results... 28983 1726883135.30450: running TaskExecutor() for managed_node2/TASK: Stat profile file 28983 1726883135.30542: in run() - task 0affe814-3a2d-b16d-c0a7-000000002889 28983 1726883135.30555: variable 'ansible_search_path' from source: unknown 28983 1726883135.30559: variable 'ansible_search_path' from source: unknown 28983 1726883135.30590: calling self._execute() 28983 1726883135.30671: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.30677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.30688: variable 'omit' from source: magic vars 28983 1726883135.31002: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.31012: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.31019: variable 'omit' from source: magic vars 28983 1726883135.31076: variable 'omit' from source: magic vars 28983 1726883135.31155: variable 'profile' from source: play vars 28983 1726883135.31161: variable 'interface' from source: play vars 28983 1726883135.31216: variable 'interface' from source: play vars 28983 1726883135.31235: variable 'omit' from source: magic vars 28983 1726883135.31276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883135.31307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883135.31325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883135.31343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883135.31353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883135.31384: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883135.31390: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.31392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.31474: Set connection var ansible_connection to ssh 28983 1726883135.31483: Set connection var ansible_shell_executable to /bin/sh 28983 1726883135.31492: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883135.31503: Set connection var ansible_timeout to 10 28983 1726883135.31510: Set connection var ansible_pipelining to False 28983 1726883135.31513: Set connection var ansible_shell_type to sh 28983 1726883135.31532: variable 'ansible_shell_executable' from source: unknown 28983 1726883135.31537: variable 'ansible_connection' from source: unknown 28983 1726883135.31540: variable 'ansible_module_compression' from source: unknown 28983 1726883135.31544: variable 'ansible_shell_type' from source: unknown 28983 1726883135.31548: variable 'ansible_shell_executable' from source: unknown 28983 1726883135.31552: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.31557: variable 'ansible_pipelining' from source: unknown 28983 1726883135.31560: variable 'ansible_timeout' from source: unknown 28983 1726883135.31565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.31735: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883135.31746: variable 'omit' from source: magic vars 28983 1726883135.31752: starting attempt loop 28983 1726883135.31755: running the handler 28983 1726883135.31768: _low_level_execute_command(): starting 28983 1726883135.31778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 
1726883135.32330: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.32336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.32339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.32342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.32403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.32408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883135.32411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.32482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.34262: stdout chunk (state=3): >>>/root <<< 28983 1726883135.34370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.34420: stderr chunk (state=3): >>><<< 28983 1726883135.34424: stdout chunk (state=3): >>><<< 28983 1726883135.34449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.34463: _low_level_execute_command(): starting 28983 1726883135.34468: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089 `" && echo ansible-tmp-1726883135.3444774-34740-156326230171089="` echo /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089 `" ) && sleep 0' 28983 1726883135.34928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.34931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 
1726883135.34943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.34945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.34992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.35002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.35074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.37115: stdout chunk (state=3): >>>ansible-tmp-1726883135.3444774-34740-156326230171089=/root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089 <<< 28983 1726883135.37231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.37281: stderr chunk (state=3): >>><<< 28983 1726883135.37285: stdout chunk (state=3): >>><<< 28983 1726883135.37299: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883135.3444774-34740-156326230171089=/root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.37337: variable 'ansible_module_compression' from source: unknown 28983 1726883135.37390: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883135.37423: variable 'ansible_facts' from source: unknown 28983 1726883135.37490: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py 28983 1726883135.37592: Sending initial data 28983 1726883135.37596: Sent initial data (153 bytes) 28983 1726883135.38043: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883135.38047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.38050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883135.38053: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883135.38056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.38133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883135.38139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.38223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.39861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28983 1726883135.39870: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883135.39938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883135.40010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp1k1bxsvr /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py <<< 28983 1726883135.40014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py" <<< 28983 1726883135.40074: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp1k1bxsvr" to remote "/root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py" <<< 28983 1726883135.40968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.41028: stderr chunk (state=3): >>><<< 28983 1726883135.41031: stdout chunk (state=3): >>><<< 28983 1726883135.41051: done transferring module to remote 28983 1726883135.41059: _low_level_execute_command(): starting 28983 1726883135.41064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/ /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py && sleep 0' 28983 1726883135.41493: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.41496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.41499: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.41501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.41565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.41567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.41631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.43570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.43617: stderr chunk (state=3): >>><<< 28983 1726883135.43621: stdout chunk (state=3): >>><<< 28983 1726883135.43633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.43639: _low_level_execute_command(): starting 28983 1726883135.43644: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/AnsiballZ_stat.py && sleep 0' 28983 1726883135.44066: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.44069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.44071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.44074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.44128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.44135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.44207: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.61388: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883135.62783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883135.62842: stderr chunk (state=3): >>><<< 28983 1726883135.62846: stdout chunk (state=3): >>><<< 28983 1726883135.62860: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883135.62893: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883135.62902: _low_level_execute_command(): starting 28983 1726883135.62910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883135.3444774-34740-156326230171089/ > /dev/null 2>&1 && sleep 0' 28983 1726883135.63386: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.63390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.63393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.63395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.63449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.63456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.63526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.65466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.65510: stderr chunk (state=3): >>><<< 28983 1726883135.65513: stdout chunk (state=3): >>><<< 28983 1726883135.65530: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.65534: handler run complete 28983 1726883135.65561: attempt loop complete, returning result 28983 1726883135.65564: _execute() done 28983 1726883135.65566: dumping result to json 28983 1726883135.65572: done dumping result, returning 28983 1726883135.65582: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affe814-3a2d-b16d-c0a7-000000002889] 28983 1726883135.65587: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002889 28983 1726883135.65692: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002889 28983 1726883135.65695: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883135.65764: no more pending results, returning what we have 28983 1726883135.65769: results queue empty 28983 1726883135.65770: checking for any_errors_fatal 28983 1726883135.65782: done checking for any_errors_fatal 28983 1726883135.65783: checking for max_fail_percentage 28983 1726883135.65786: done checking for max_fail_percentage 28983 1726883135.65787: checking to see if all hosts have failed and the running result is not ok 28983 1726883135.65788: done checking to see if all hosts have failed 28983 1726883135.65789: getting the remaining hosts for this loop 28983 1726883135.65791: done getting the remaining hosts for this loop 28983 1726883135.65796: getting the next task for host managed_node2 28983 1726883135.65806: done getting next task for host managed_node2 28983 1726883135.65809: ^ task is: TASK: Set NM profile exist flag based on the profile files 28983 1726883135.65815: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883135.65820: getting variables 28983 1726883135.65822: in VariableManager get_vars() 28983 1726883135.65875: Calling all_inventory to load vars for managed_node2 28983 1726883135.65878: Calling groups_inventory to load vars for managed_node2 28983 1726883135.65882: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.65893: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.65896: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.65900: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.67370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.68969: done with get_vars() 28983 1726883135.68998: done getting variables 28983 1726883135.69049: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:45:35 -0400 (0:00:00.390) 0:02:45.688 ****** 28983 1726883135.69079: entering _queue_task() for managed_node2/set_fact 28983 1726883135.69338: worker is 1 (out of 1 available) 28983 1726883135.69353: exiting _queue_task() for managed_node2/set_fact 28983 1726883135.69368: done queuing things up, now waiting for results queue to drain 28983 1726883135.69370: waiting for pending results... 28983 1726883135.69586: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 28983 1726883135.69683: in run() - task 0affe814-3a2d-b16d-c0a7-00000000288a 28983 1726883135.69697: variable 'ansible_search_path' from source: unknown 28983 1726883135.69702: variable 'ansible_search_path' from source: unknown 28983 1726883135.69733: calling self._execute() 28983 1726883135.69822: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.69828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.69841: variable 'omit' from source: magic vars 28983 1726883135.70174: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.70183: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.70292: variable 'profile_stat' from source: set_fact 28983 1726883135.70302: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883135.70306: when evaluation is False, skipping this task 28983 1726883135.70308: _execute() done 28983 1726883135.70313: dumping result to json 28983 1726883135.70318: done dumping 
result, returning 28983 1726883135.70331: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affe814-3a2d-b16d-c0a7-00000000288a] 28983 1726883135.70335: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288a 28983 1726883135.70427: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288a 28983 1726883135.70433: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883135.70514: no more pending results, returning what we have 28983 1726883135.70518: results queue empty 28983 1726883135.70519: checking for any_errors_fatal 28983 1726883135.70527: done checking for any_errors_fatal 28983 1726883135.70528: checking for max_fail_percentage 28983 1726883135.70530: done checking for max_fail_percentage 28983 1726883135.70531: checking to see if all hosts have failed and the running result is not ok 28983 1726883135.70532: done checking to see if all hosts have failed 28983 1726883135.70533: getting the remaining hosts for this loop 28983 1726883135.70536: done getting the remaining hosts for this loop 28983 1726883135.70541: getting the next task for host managed_node2 28983 1726883135.70550: done getting next task for host managed_node2 28983 1726883135.70552: ^ task is: TASK: Get NM profile info 28983 1726883135.70558: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883135.70562: getting variables 28983 1726883135.70564: in VariableManager get_vars() 28983 1726883135.70605: Calling all_inventory to load vars for managed_node2 28983 1726883135.70608: Calling groups_inventory to load vars for managed_node2 28983 1726883135.70611: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883135.70624: Calling all_plugins_play to load vars for managed_node2 28983 1726883135.70628: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883135.70631: Calling groups_plugins_play to load vars for managed_node2 28983 1726883135.71970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883135.73595: done with get_vars() 28983 1726883135.73618: done getting variables 28983 1726883135.73671: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:45:35 -0400 (0:00:00.046) 
0:02:45.734 ****** 28983 1726883135.73701: entering _queue_task() for managed_node2/shell 28983 1726883135.73936: worker is 1 (out of 1 available) 28983 1726883135.73952: exiting _queue_task() for managed_node2/shell 28983 1726883135.73965: done queuing things up, now waiting for results queue to drain 28983 1726883135.73967: waiting for pending results... 28983 1726883135.74169: running TaskExecutor() for managed_node2/TASK: Get NM profile info 28983 1726883135.74272: in run() - task 0affe814-3a2d-b16d-c0a7-00000000288b 28983 1726883135.74289: variable 'ansible_search_path' from source: unknown 28983 1726883135.74293: variable 'ansible_search_path' from source: unknown 28983 1726883135.74325: calling self._execute() 28983 1726883135.74415: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.74421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.74433: variable 'omit' from source: magic vars 28983 1726883135.74764: variable 'ansible_distribution_major_version' from source: facts 28983 1726883135.74775: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883135.74784: variable 'omit' from source: magic vars 28983 1726883135.74825: variable 'omit' from source: magic vars 28983 1726883135.74915: variable 'profile' from source: play vars 28983 1726883135.74920: variable 'interface' from source: play vars 28983 1726883135.74983: variable 'interface' from source: play vars 28983 1726883135.75000: variable 'omit' from source: magic vars 28983 1726883135.75042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883135.75077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883135.75100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883135.75116: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883135.75126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883135.75157: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883135.75160: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.75165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.75250: Set connection var ansible_connection to ssh 28983 1726883135.75262: Set connection var ansible_shell_executable to /bin/sh 28983 1726883135.75271: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883135.75287: Set connection var ansible_timeout to 10 28983 1726883135.75290: Set connection var ansible_pipelining to False 28983 1726883135.75294: Set connection var ansible_shell_type to sh 28983 1726883135.75314: variable 'ansible_shell_executable' from source: unknown 28983 1726883135.75317: variable 'ansible_connection' from source: unknown 28983 1726883135.75319: variable 'ansible_module_compression' from source: unknown 28983 1726883135.75324: variable 'ansible_shell_type' from source: unknown 28983 1726883135.75327: variable 'ansible_shell_executable' from source: unknown 28983 1726883135.75332: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883135.75338: variable 'ansible_pipelining' from source: unknown 28983 1726883135.75341: variable 'ansible_timeout' from source: unknown 28983 1726883135.75347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883135.75466: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883135.75480: variable 'omit' from source: magic vars 28983 1726883135.75486: starting attempt loop 28983 1726883135.75489: running the handler 28983 1726883135.75507: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883135.75521: _low_level_execute_command(): starting 28983 1726883135.75529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883135.76070: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.76076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.76080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.76138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
28983 1726883135.76142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.76226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.78005: stdout chunk (state=3): >>>/root <<< 28983 1726883135.78117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.78173: stderr chunk (state=3): >>><<< 28983 1726883135.78179: stdout chunk (state=3): >>><<< 28983 1726883135.78202: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.78214: _low_level_execute_command(): starting 28983 1726883135.78220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556 `" && echo ansible-tmp-1726883135.782024-34749-135432430870556="` echo /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556 `" ) && sleep 0' 28983 1726883135.78692: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.78702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.78705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883135.78708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883135.78711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.78754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.78757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.78837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.80879: stdout chunk (state=3): >>>ansible-tmp-1726883135.782024-34749-135432430870556=/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556 <<< 28983 1726883135.80994: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 28983 1726883135.81041: stderr chunk (state=3): >>><<< 28983 1726883135.81044: stdout chunk (state=3): >>><<< 28983 1726883135.81059: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883135.782024-34749-135432430870556=/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.81111: variable 'ansible_module_compression' from source: unknown 28983 1726883135.81138: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883135.81171: variable 'ansible_facts' from source: unknown 28983 1726883135.81241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py 28983 1726883135.81354: Sending initial 
data 28983 1726883135.81358: Sent initial data (155 bytes) 28983 1726883135.81810: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883135.81814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883135.81817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.81819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.81822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.81884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883135.81887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.81954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.83625: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883135.83690: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883135.83758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmph7925yse /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py <<< 28983 1726883135.83767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py" <<< 28983 1726883135.83827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmph7925yse" to remote "/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py" <<< 28983 1726883135.84725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.84782: stderr chunk (state=3): >>><<< 28983 1726883135.84785: stdout chunk (state=3): >>><<< 28983 1726883135.84808: done transferring module to remote 28983 1726883135.84816: _low_level_execute_command(): starting 28983 1726883135.84822: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/ /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py && sleep 0' 28983 1726883135.85260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.85263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.85266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.85271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.85324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883135.85331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.85400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883135.87258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883135.87303: stderr chunk (state=3): >>><<< 28983 1726883135.87306: stdout chunk (state=3): >>><<< 28983 1726883135.87323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883135.87326: _low_level_execute_command(): starting 28983 1726883135.87329: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/AnsiballZ_command.py && sleep 0' 28983 1726883135.87755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883135.87758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.87761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883135.87763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883135.87816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883135.87820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883135.87897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883136.06749: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:45:36.048777", "end": "2024-09-20 21:45:36.066283", "delta": "0:00:00.017506", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883136.08277: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883136.08339: stderr chunk (state=3): >>><<< 28983 1726883136.08343: stdout chunk (state=3): >>><<< 28983 1726883136.08363: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:45:36.048777", "end": "2024-09-20 21:45:36.066283", "delta": "0:00:00.017506", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
28983 1726883136.08403: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883136.08413: _low_level_execute_command(): starting 28983 1726883136.08419: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883135.782024-34749-135432430870556/ > /dev/null 2>&1 && sleep 0' 28983 1726883136.08899: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883136.08902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.08910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883136.08912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883136.08915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.08965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883136.08972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883136.09042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883136.10964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883136.11011: stderr chunk (state=3): >>><<< 28983 1726883136.11014: stdout chunk (state=3): >>><<< 28983 1726883136.11029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883136.11041: handler run complete 28983 1726883136.11061: Evaluated conditional (False): False 28983 1726883136.11076: attempt loop complete, returning result 28983 1726883136.11079: _execute() done 28983 1726883136.11081: dumping result to json 28983 1726883136.11087: done dumping result, returning 28983 1726883136.11094: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affe814-3a2d-b16d-c0a7-00000000288b] 28983 1726883136.11099: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288b 28983 1726883136.11211: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288b 28983 1726883136.11215: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.017506", "end": "2024-09-20 21:45:36.066283", "rc": 1, "start": "2024-09-20 21:45:36.048777" } MSG: non-zero return code ...ignoring 28983 1726883136.11305: no more pending results, returning what we have 28983 1726883136.11310: results queue empty 28983 1726883136.11311: checking for any_errors_fatal 28983 1726883136.11318: done checking for any_errors_fatal 28983 1726883136.11319: checking for max_fail_percentage 28983 1726883136.11321: done checking for max_fail_percentage 28983 1726883136.11322: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.11325: done checking to see if all hosts have failed 28983 1726883136.11326: getting the remaining hosts for this loop 28983 1726883136.11328: done getting the remaining hosts for this loop 28983 1726883136.11333: getting the next task for host managed_node2 28983 1726883136.11343: done getting next task for host managed_node2 28983 1726883136.11347: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883136.11352: ^ state 
is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883136.11356: getting variables 28983 1726883136.11358: in VariableManager get_vars() 28983 1726883136.11409: Calling all_inventory to load vars for managed_node2 28983 1726883136.11412: Calling groups_inventory to load vars for managed_node2 28983 1726883136.11417: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.11427: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.11430: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.11442: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.12768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.14390: done with get_vars() 28983 1726883136.14414: done getting variables 28983 1726883136.14465: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:45:36 -0400 (0:00:00.407) 0:02:46.142 ****** 28983 1726883136.14497: entering _queue_task() for managed_node2/set_fact 28983 1726883136.14748: worker is 1 (out of 1 available) 28983 1726883136.14764: exiting _queue_task() for managed_node2/set_fact 28983 1726883136.14781: done queuing things up, now waiting for results queue to drain 28983 1726883136.14783: waiting for pending results... 
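The failed-but-ignored result above ("fatal: [managed_node2] ... MSG: non-zero return code ...ignoring", rc=1 with empty stdout) comes from the "Get NM profile info" task in get_profile_stat.yml. A minimal sketch of what that task likely looks like, reconstructed from the module args recorded in the log (`_raw_params`, `_uses_shell: true`, the later reference to `nm_profile_exists.rc`); the exact file contents are an assumption:

```yaml
# Hypothetical reconstruction from the log; the real task lives at
# tests/network/playbooks/tasks/get_profile_stat.yml. With no connection
# named "statebr", grep matches nothing and exits 1, which is why the log
# records rc=1 with empty stdout even though nothing actually went wrong.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true  # matches the "...ignoring" marker after the failure
```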
28983 1726883136.14995: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 28983 1726883136.15094: in run() - task 0affe814-3a2d-b16d-c0a7-00000000288c 28983 1726883136.15109: variable 'ansible_search_path' from source: unknown 28983 1726883136.15115: variable 'ansible_search_path' from source: unknown 28983 1726883136.15147: calling self._execute() 28983 1726883136.15238: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.15244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.15255: variable 'omit' from source: magic vars 28983 1726883136.15592: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.15604: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.15729: variable 'nm_profile_exists' from source: set_fact 28983 1726883136.15741: Evaluated conditional (nm_profile_exists.rc == 0): False 28983 1726883136.15744: when evaluation is False, skipping this task 28983 1726883136.15747: _execute() done 28983 1726883136.15754: dumping result to json 28983 1726883136.15756: done dumping result, returning 28983 1726883136.15765: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affe814-3a2d-b16d-c0a7-00000000288c] 28983 1726883136.15774: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288c 28983 1726883136.15866: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288c 28983 1726883136.15870: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 28983 1726883136.15936: no more pending results, returning what we have 28983 1726883136.15941: results queue empty 28983 1726883136.15942: checking for any_errors_fatal 28983 
1726883136.15951: done checking for any_errors_fatal 28983 1726883136.15952: checking for max_fail_percentage 28983 1726883136.15954: done checking for max_fail_percentage 28983 1726883136.15955: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.15956: done checking to see if all hosts have failed 28983 1726883136.15957: getting the remaining hosts for this loop 28983 1726883136.15958: done getting the remaining hosts for this loop 28983 1726883136.15963: getting the next task for host managed_node2 28983 1726883136.15976: done getting next task for host managed_node2 28983 1726883136.15980: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883136.15987: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883136.15991: getting variables 28983 1726883136.15992: in VariableManager get_vars() 28983 1726883136.16032: Calling all_inventory to load vars for managed_node2 28983 1726883136.16037: Calling groups_inventory to load vars for managed_node2 28983 1726883136.16040: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.16051: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.16054: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.16057: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.17449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.19053: done with get_vars() 28983 1726883136.19078: done getting variables 28983 1726883136.19129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883136.19227: variable 'profile' from source: play vars 28983 1726883136.19230: variable 'interface' from source: play vars 28983 1726883136.19283: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:45:36 -0400 (0:00:00.048) 0:02:46.190 ****** 28983 1726883136.19310: entering _queue_task() for managed_node2/command 28983 1726883136.19578: worker is 1 (out of 1 available) 28983 1726883136.19595: exiting _queue_task() for managed_node2/command 28983 1726883136.19609: done queuing things up, now waiting for results queue to drain 28983 1726883136.19611: waiting for pending results... 
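The skip above records `"false_condition": "nm_profile_exists.rc == 0"`: because the nmcli pipeline returned rc=1, the flag-setting task does not run. A hedged sketch of that task; the condition is taken verbatim from the log, but the fact names assigned are placeholders for whatever the real task sets:

```yaml
# Hedged sketch: only the "when" expression is confirmed by the log
# ("false_condition": "nm_profile_exists.rc == 0"); the fact names below
# are placeholders, not the role's actual variable names.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    profile_exists_flag: true      # placeholder name
    profile_ansible_managed: true  # placeholder name
  when: nm_profile_exists.rc == 0
```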
28983 1726883136.19804: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 28983 1726883136.19911: in run() - task 0affe814-3a2d-b16d-c0a7-00000000288e 28983 1726883136.19925: variable 'ansible_search_path' from source: unknown 28983 1726883136.19928: variable 'ansible_search_path' from source: unknown 28983 1726883136.19965: calling self._execute() 28983 1726883136.20049: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.20059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.20075: variable 'omit' from source: magic vars 28983 1726883136.20389: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.20400: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.20516: variable 'profile_stat' from source: set_fact 28983 1726883136.20526: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883136.20529: when evaluation is False, skipping this task 28983 1726883136.20532: _execute() done 28983 1726883136.20541: dumping result to json 28983 1726883136.20546: done dumping result, returning 28983 1726883136.20552: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000288e] 28983 1726883136.20558: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288e 28983 1726883136.20657: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288e 28983 1726883136.20660: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883136.20719: no more pending results, returning what we have 28983 1726883136.20723: results queue empty 28983 1726883136.20724: checking for any_errors_fatal 28983 1726883136.20731: done checking for any_errors_fatal 28983 1726883136.20732: 
checking for max_fail_percentage 28983 1726883136.20736: done checking for max_fail_percentage 28983 1726883136.20738: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.20738: done checking to see if all hosts have failed 28983 1726883136.20739: getting the remaining hosts for this loop 28983 1726883136.20741: done getting the remaining hosts for this loop 28983 1726883136.20746: getting the next task for host managed_node2 28983 1726883136.20754: done getting next task for host managed_node2 28983 1726883136.20757: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 28983 1726883136.20763: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883136.20767: getting variables 28983 1726883136.20768: in VariableManager get_vars() 28983 1726883136.20811: Calling all_inventory to load vars for managed_node2 28983 1726883136.20814: Calling groups_inventory to load vars for managed_node2 28983 1726883136.20817: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.20827: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.20830: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.20840: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.22229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.23837: done with get_vars() 28983 1726883136.23863: done getting variables 28983 1726883136.23916: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883136.24011: variable 'profile' from source: play vars 28983 1726883136.24014: variable 'interface' from source: play vars 28983 1726883136.24063: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:45:36 -0400 (0:00:00.047) 0:02:46.238 ****** 28983 1726883136.24093: entering _queue_task() for managed_node2/set_fact 28983 1726883136.24373: worker is 1 (out of 1 available) 28983 1726883136.24387: exiting _queue_task() for managed_node2/set_fact 28983 1726883136.24401: done queuing things up, now waiting for results queue to drain 28983 1726883136.24403: waiting for pending results... 
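Both of the skips in this stretch ("Get the ansible_managed comment in ifcfg-statebr" and "Verify the ansible_managed comment in ifcfg-statebr") gate on the same condition the log records, `"false_condition": "profile_stat.stat.exists"`: since no ifcfg file for the profile exists, neither task runs. A hedged sketch of the first of the pair; the grep target and register name are illustrative guesses, only the task name, templated `{{ profile }}`, and the `when` condition come from the log:

```yaml
# Hedged sketch: the ifcfg path and register name are placeholders; the
# condition is confirmed by the log ("false_condition": "profile_stat.stat.exists").
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment  # placeholder name
  when: profile_stat.stat.exists
```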
28983 1726883136.24608: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 28983 1726883136.24714: in run() - task 0affe814-3a2d-b16d-c0a7-00000000288f 28983 1726883136.24728: variable 'ansible_search_path' from source: unknown 28983 1726883136.24733: variable 'ansible_search_path' from source: unknown 28983 1726883136.24768: calling self._execute() 28983 1726883136.24864: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.24872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.24885: variable 'omit' from source: magic vars 28983 1726883136.25210: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.25222: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.25330: variable 'profile_stat' from source: set_fact 28983 1726883136.25342: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883136.25345: when evaluation is False, skipping this task 28983 1726883136.25349: _execute() done 28983 1726883136.25353: dumping result to json 28983 1726883136.25358: done dumping result, returning 28983 1726883136.25364: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-00000000288f] 28983 1726883136.25370: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288f 28983 1726883136.25477: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000288f 28983 1726883136.25480: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883136.25560: no more pending results, returning what we have 28983 1726883136.25564: results queue empty 28983 1726883136.25565: checking for any_errors_fatal 28983 1726883136.25573: done checking for any_errors_fatal 28983 1726883136.25574: 
checking for max_fail_percentage 28983 1726883136.25576: done checking for max_fail_percentage 28983 1726883136.25577: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.25578: done checking to see if all hosts have failed 28983 1726883136.25579: getting the remaining hosts for this loop 28983 1726883136.25581: done getting the remaining hosts for this loop 28983 1726883136.25585: getting the next task for host managed_node2 28983 1726883136.25593: done getting next task for host managed_node2 28983 1726883136.25596: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 28983 1726883136.25601: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883136.25605: getting variables 28983 1726883136.25606: in VariableManager get_vars() 28983 1726883136.25652: Calling all_inventory to load vars for managed_node2 28983 1726883136.25656: Calling groups_inventory to load vars for managed_node2 28983 1726883136.25659: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.25670: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.25673: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.25676: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.26979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.28589: done with get_vars() 28983 1726883136.28616: done getting variables 28983 1726883136.28674: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883136.28769: variable 'profile' from source: play vars 28983 1726883136.28775: variable 'interface' from source: play vars 28983 1726883136.28822: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:45:36 -0400 (0:00:00.047) 0:02:46.286 ****** 28983 1726883136.28853: entering _queue_task() for managed_node2/command 28983 1726883136.29140: worker is 1 (out of 1 available) 28983 1726883136.29154: exiting _queue_task() for managed_node2/command 28983 1726883136.29169: done queuing things up, now waiting for results queue to drain 28983 1726883136.29171: waiting for pending results... 
28983 1726883136.29393: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 28983 1726883136.29497: in run() - task 0affe814-3a2d-b16d-c0a7-000000002890 28983 1726883136.29512: variable 'ansible_search_path' from source: unknown 28983 1726883136.29516: variable 'ansible_search_path' from source: unknown 28983 1726883136.29548: calling self._execute() 28983 1726883136.29639: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.29645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.29656: variable 'omit' from source: magic vars 28983 1726883136.29982: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.29994: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.30102: variable 'profile_stat' from source: set_fact 28983 1726883136.30113: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883136.30117: when evaluation is False, skipping this task 28983 1726883136.30122: _execute() done 28983 1726883136.30124: dumping result to json 28983 1726883136.30135: done dumping result, returning 28983 1726883136.30139: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000002890] 28983 1726883136.30145: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002890 28983 1726883136.30238: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002890 28983 1726883136.30241: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883136.30329: no more pending results, returning what we have 28983 1726883136.30335: results queue empty 28983 1726883136.30337: checking for any_errors_fatal 28983 1726883136.30342: done checking for any_errors_fatal 28983 1726883136.30343: checking for 
max_fail_percentage 28983 1726883136.30345: done checking for max_fail_percentage 28983 1726883136.30346: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.30347: done checking to see if all hosts have failed 28983 1726883136.30348: getting the remaining hosts for this loop 28983 1726883136.30351: done getting the remaining hosts for this loop 28983 1726883136.30356: getting the next task for host managed_node2 28983 1726883136.30364: done getting next task for host managed_node2 28983 1726883136.30367: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 28983 1726883136.30374: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883136.30378: getting variables 28983 1726883136.30380: in VariableManager get_vars() 28983 1726883136.30421: Calling all_inventory to load vars for managed_node2 28983 1726883136.30423: Calling groups_inventory to load vars for managed_node2 28983 1726883136.30427: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.30445: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.30448: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.30452: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.36653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.38235: done with get_vars() 28983 1726883136.38261: done getting variables 28983 1726883136.38307: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883136.38385: variable 'profile' from source: play vars 28983 1726883136.38388: variable 'interface' from source: play vars 28983 1726883136.38439: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:45:36 -0400 (0:00:00.096) 0:02:46.382 ****** 28983 1726883136.38462: entering _queue_task() for managed_node2/set_fact 28983 1726883136.38752: worker is 1 (out of 1 available) 28983 1726883136.38767: exiting _queue_task() for managed_node2/set_fact 28983 1726883136.38780: done queuing things up, now waiting for results queue to drain 28983 1726883136.38783: waiting for pending results... 
28983 1726883136.38998: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 28983 1726883136.39130: in run() - task 0affe814-3a2d-b16d-c0a7-000000002891 28983 1726883136.39145: variable 'ansible_search_path' from source: unknown 28983 1726883136.39150: variable 'ansible_search_path' from source: unknown 28983 1726883136.39185: calling self._execute() 28983 1726883136.39269: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.39283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.39338: variable 'omit' from source: magic vars 28983 1726883136.39624: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.39636: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.39746: variable 'profile_stat' from source: set_fact 28983 1726883136.39757: Evaluated conditional (profile_stat.stat.exists): False 28983 1726883136.39763: when evaluation is False, skipping this task 28983 1726883136.39767: _execute() done 28983 1726883136.39771: dumping result to json 28983 1726883136.39773: done dumping result, returning 28983 1726883136.39784: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0affe814-3a2d-b16d-c0a7-000000002891] 28983 1726883136.39789: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002891 28983 1726883136.39886: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002891 28983 1726883136.39889: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 28983 1726883136.39950: no more pending results, returning what we have 28983 1726883136.39955: results queue empty 28983 1726883136.39956: checking for any_errors_fatal 28983 1726883136.39968: done checking for any_errors_fatal 28983 1726883136.39969: checking 
for max_fail_percentage 28983 1726883136.39971: done checking for max_fail_percentage 28983 1726883136.39972: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.39973: done checking to see if all hosts have failed 28983 1726883136.39974: getting the remaining hosts for this loop 28983 1726883136.39976: done getting the remaining hosts for this loop 28983 1726883136.39980: getting the next task for host managed_node2 28983 1726883136.39992: done getting next task for host managed_node2 28983 1726883136.39996: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 28983 1726883136.40007: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883136.40012: getting variables 28983 1726883136.40014: in VariableManager get_vars() 28983 1726883136.40060: Calling all_inventory to load vars for managed_node2 28983 1726883136.40063: Calling groups_inventory to load vars for managed_node2 28983 1726883136.40068: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.40079: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.40084: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.40088: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.41462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.43067: done with get_vars() 28983 1726883136.43093: done getting variables 28983 1726883136.43144: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883136.43242: variable 'profile' from source: play vars 28983 1726883136.43245: variable 'interface' from source: play vars 28983 1726883136.43296: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:45:36 -0400 (0:00:00.048) 0:02:46.431 ****** 28983 1726883136.43323: entering _queue_task() for managed_node2/assert 28983 1726883136.43598: worker is 1 (out of 1 available) 28983 1726883136.43614: exiting _queue_task() for managed_node2/assert 28983 1726883136.43630: done queuing things up, now waiting for results queue to drain 28983 1726883136.43632: waiting for pending results... 
28983 1726883136.43837: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 28983 1726883136.43944: in run() - task 0affe814-3a2d-b16d-c0a7-000000002805 28983 1726883136.43957: variable 'ansible_search_path' from source: unknown 28983 1726883136.43960: variable 'ansible_search_path' from source: unknown 28983 1726883136.43999: calling self._execute() 28983 1726883136.44092: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.44098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.44111: variable 'omit' from source: magic vars 28983 1726883136.44435: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.44446: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.44453: variable 'omit' from source: magic vars 28983 1726883136.44497: variable 'omit' from source: magic vars 28983 1726883136.44587: variable 'profile' from source: play vars 28983 1726883136.44591: variable 'interface' from source: play vars 28983 1726883136.44653: variable 'interface' from source: play vars 28983 1726883136.44670: variable 'omit' from source: magic vars 28983 1726883136.44709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883136.44747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883136.44766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883136.44785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883136.44795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883136.44823: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 28983 1726883136.44826: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.44831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.44916: Set connection var ansible_connection to ssh 28983 1726883136.44927: Set connection var ansible_shell_executable to /bin/sh 28983 1726883136.44937: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883136.44948: Set connection var ansible_timeout to 10 28983 1726883136.44954: Set connection var ansible_pipelining to False 28983 1726883136.44959: Set connection var ansible_shell_type to sh 28983 1726883136.44983: variable 'ansible_shell_executable' from source: unknown 28983 1726883136.44986: variable 'ansible_connection' from source: unknown 28983 1726883136.44989: variable 'ansible_module_compression' from source: unknown 28983 1726883136.44992: variable 'ansible_shell_type' from source: unknown 28983 1726883136.44994: variable 'ansible_shell_executable' from source: unknown 28983 1726883136.44999: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.45004: variable 'ansible_pipelining' from source: unknown 28983 1726883136.45007: variable 'ansible_timeout' from source: unknown 28983 1726883136.45012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.45135: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883136.45146: variable 'omit' from source: magic vars 28983 1726883136.45152: starting attempt loop 28983 1726883136.45155: running the handler 28983 1726883136.45265: variable 'lsr_net_profile_exists' from source: set_fact 28983 1726883136.45269: Evaluated conditional (not 
lsr_net_profile_exists): True 28983 1726883136.45280: handler run complete 28983 1726883136.45299: attempt loop complete, returning result 28983 1726883136.45302: _execute() done 28983 1726883136.45305: dumping result to json 28983 1726883136.45309: done dumping result, returning 28983 1726883136.45317: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-000000002805] 28983 1726883136.45322: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002805 28983 1726883136.45417: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002805 28983 1726883136.45420: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883136.45476: no more pending results, returning what we have 28983 1726883136.45480: results queue empty 28983 1726883136.45480: checking for any_errors_fatal 28983 1726883136.45489: done checking for any_errors_fatal 28983 1726883136.45490: checking for max_fail_percentage 28983 1726883136.45491: done checking for max_fail_percentage 28983 1726883136.45493: checking to see if all hosts have failed and the running result is not ok 28983 1726883136.45493: done checking to see if all hosts have failed 28983 1726883136.45494: getting the remaining hosts for this loop 28983 1726883136.45496: done getting the remaining hosts for this loop 28983 1726883136.45501: getting the next task for host managed_node2 28983 1726883136.45511: done getting next task for host managed_node2 28983 1726883136.45514: ^ task is: TASK: Get NetworkManager RPM version 28983 1726883136.45519: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883136.45523: getting variables 28983 1726883136.45524: in VariableManager get_vars() 28983 1726883136.45573: Calling all_inventory to load vars for managed_node2 28983 1726883136.45577: Calling groups_inventory to load vars for managed_node2 28983 1726883136.45580: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883136.45591: Calling all_plugins_play to load vars for managed_node2 28983 1726883136.45594: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883136.45598: Calling groups_plugins_play to load vars for managed_node2 28983 1726883136.46886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883136.48523: done with get_vars() 28983 1726883136.48547: done getting variables 28983 1726883136.48597: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Friday 20 September 2024 21:45:36 -0400 (0:00:00.053) 0:02:46.484 ****** 28983 1726883136.48627: entering _queue_task() for managed_node2/command 28983 1726883136.48861: worker is 1 (out of 1 available) 28983 1726883136.48881: exiting _queue_task() for managed_node2/command 28983 1726883136.48894: done queuing things up, now waiting for results queue to drain 28983 1726883136.48896: waiting for pending results... 28983 1726883136.49143: running TaskExecutor() for managed_node2/TASK: Get NetworkManager RPM version 28983 1726883136.49204: in run() - task 0affe814-3a2d-b16d-c0a7-000000002809 28983 1726883136.49217: variable 'ansible_search_path' from source: unknown 28983 1726883136.49220: variable 'ansible_search_path' from source: unknown 28983 1726883136.49258: calling self._execute() 28983 1726883136.49350: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.49356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.49367: variable 'omit' from source: magic vars 28983 1726883136.49693: variable 'ansible_distribution_major_version' from source: facts 28983 1726883136.49704: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883136.49712: variable 'omit' from source: magic vars 28983 1726883136.49752: variable 'omit' from source: magic vars 28983 1726883136.49815: variable 'omit' from source: magic vars 28983 1726883136.49819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883136.49853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883136.49871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883136.49891: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883136.49901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883136.49930: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883136.49937: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.49940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.50020: Set connection var ansible_connection to ssh 28983 1726883136.50032: Set connection var ansible_shell_executable to /bin/sh 28983 1726883136.50044: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883136.50052: Set connection var ansible_timeout to 10 28983 1726883136.50059: Set connection var ansible_pipelining to False 28983 1726883136.50062: Set connection var ansible_shell_type to sh 28983 1726883136.50084: variable 'ansible_shell_executable' from source: unknown 28983 1726883136.50087: variable 'ansible_connection' from source: unknown 28983 1726883136.50090: variable 'ansible_module_compression' from source: unknown 28983 1726883136.50093: variable 'ansible_shell_type' from source: unknown 28983 1726883136.50096: variable 'ansible_shell_executable' from source: unknown 28983 1726883136.50102: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883136.50105: variable 'ansible_pipelining' from source: unknown 28983 1726883136.50113: variable 'ansible_timeout' from source: unknown 28983 1726883136.50116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883136.50230: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883136.50243: variable 'omit' from source: magic vars 28983 1726883136.50248: starting attempt loop 28983 1726883136.50255: running the handler 28983 1726883136.50271: _low_level_execute_command(): starting 28983 1726883136.50279: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883136.50807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883136.50847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.50852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883136.50854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883136.50858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.50912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883136.50916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883136.50920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883136.50997: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883136.52764: stdout chunk (state=3): >>>/root <<< 28983 1726883136.52865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883136.52919: stderr chunk (state=3): >>><<< 28983 1726883136.52923: stdout chunk (state=3): >>><<< 28983 1726883136.52949: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883136.52961: _low_level_execute_command(): starting 28983 1726883136.52967: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005 `" && echo ansible-tmp-1726883136.5294933-34766-84804597678005="` echo 
/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005 `" ) && sleep 0' 28983 1726883136.53428: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883136.53431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883136.53434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883136.53445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883136.53448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.53497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883136.53502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883136.53578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883136.55594: stdout chunk (state=3): >>>ansible-tmp-1726883136.5294933-34766-84804597678005=/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005 <<< 28983 1726883136.55709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883136.55757: stderr chunk (state=3): >>><<< 28983 1726883136.55760: stdout chunk 
(state=3): >>><<< 28983 1726883136.55780: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883136.5294933-34766-84804597678005=/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883136.55805: variable 'ansible_module_compression' from source: unknown 28983 1726883136.55852: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883136.55887: variable 'ansible_facts' from source: unknown 28983 1726883136.55953: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py 28983 1726883136.56062: Sending initial data 28983 1726883136.56065: Sent initial data (155 bytes) 28983 1726883136.56533: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883136.56538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883136.56541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.56543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883136.56546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.56599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883136.56606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883136.56677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883136.58303: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883136.58307: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883136.58367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883136.58436: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmptq8ntdqx /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py <<< 28983 1726883136.58440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py" <<< 28983 1726883136.58506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmptq8ntdqx" to remote "/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py" <<< 28983 1726883136.59402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883136.59469: stderr chunk (state=3): >>><<< 28983 1726883136.59473: stdout chunk (state=3): >>><<< 28983 1726883136.59497: done transferring module to remote 28983 1726883136.59507: _low_level_execute_command(): starting 28983 1726883136.59512: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/ /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py && sleep 0' 28983 1726883136.59976: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 28983 1726883136.59979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883136.59982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 28983 1726883136.59984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883136.59986: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.60045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883136.60050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883136.60117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883136.61995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883136.62039: stderr chunk (state=3): >>><<< 28983 1726883136.62042: stdout chunk (state=3): >>><<< 28983 1726883136.62058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883136.62061: _low_level_execute_command(): starting 28983 1726883136.62067: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/AnsiballZ_command.py && sleep 0' 28983 1726883136.62520: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883136.62523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.62525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883136.62528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883136.62588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883136.62591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883136.62665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.11227: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.44.2-1.fc39", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 21:45:36.796917", "end": "2024-09-20 21:45:37.110967", "delta": "0:00:00.314050", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883137.12908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883137.12974: stderr chunk (state=3): >>><<< 28983 1726883137.12978: stdout chunk (state=3): >>><<< 28983 1726883137.12999: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.44.2-1.fc39", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 21:45:36.796917", "end": "2024-09-20 21:45:37.110967", "delta": "0:00:00.314050", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
28983 1726883137.13041: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883137.13050: _low_level_execute_command(): starting 28983 1726883137.13056: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883136.5294933-34766-84804597678005/ > /dev/null 2>&1 && sleep 0' 28983 1726883137.13540: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883137.13543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883137.13546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address <<< 28983 1726883137.13548: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883137.13560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.13601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883137.13625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883137.13686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.15613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883137.15665: stderr chunk (state=3): >>><<< 28983 1726883137.15668: stdout chunk (state=3): >>><<< 28983 1726883137.15688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 28983 1726883137.15694: handler run complete 28983 1726883137.15717: Evaluated conditional (False): False 28983 1726883137.15727: attempt loop complete, returning result 28983 1726883137.15731: _execute() done 28983 1726883137.15740: dumping result to json 28983 1726883137.15745: done dumping result, returning 28983 1726883137.15753: done running TaskExecutor() for managed_node2/TASK: Get NetworkManager RPM version [0affe814-3a2d-b16d-c0a7-000000002809] 28983 1726883137.15762: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002809 28983 1726883137.15878: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002809 28983 1726883137.15881: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.314050", "end": "2024-09-20 21:45:37.110967", "rc": 0, "start": "2024-09-20 21:45:36.796917" } STDOUT: NetworkManager-1.44.2-1.fc39 28983 1726883137.15981: no more pending results, returning what we have 28983 1726883137.15985: results queue empty 28983 1726883137.15986: checking for any_errors_fatal 28983 1726883137.16000: done checking for any_errors_fatal 28983 1726883137.16002: checking for max_fail_percentage 28983 1726883137.16004: done checking for max_fail_percentage 28983 1726883137.16005: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.16006: done checking to see if all hosts have failed 28983 1726883137.16007: getting the remaining hosts for this loop 28983 1726883137.16009: done getting the remaining hosts for this loop 28983 1726883137.16014: getting the next task for host managed_node2 28983 1726883137.16022: done getting next task for host managed_node2 28983 1726883137.16025: ^ task is: TASK: Store NetworkManager version 28983 1726883137.16029: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.16033: getting variables 28983 1726883137.16037: in VariableManager get_vars() 28983 1726883137.16087: Calling all_inventory to load vars for managed_node2 28983 1726883137.16091: Calling groups_inventory to load vars for managed_node2 28983 1726883137.16095: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.16112: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.16116: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.16120: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.17586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.19186: done with get_vars() 28983 1726883137.19210: done getting variables 28983 1726883137.19265: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** 
task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Friday 20 September 2024 21:45:37 -0400 (0:00:00.706) 0:02:47.190 ****** 28983 1726883137.19295: entering _queue_task() for managed_node2/set_fact 28983 1726883137.19540: worker is 1 (out of 1 available) 28983 1726883137.19555: exiting _queue_task() for managed_node2/set_fact 28983 1726883137.19568: done queuing things up, now waiting for results queue to drain 28983 1726883137.19570: waiting for pending results... 28983 1726883137.19778: running TaskExecutor() for managed_node2/TASK: Store NetworkManager version 28983 1726883137.19888: in run() - task 0affe814-3a2d-b16d-c0a7-00000000280a 28983 1726883137.19901: variable 'ansible_search_path' from source: unknown 28983 1726883137.19905: variable 'ansible_search_path' from source: unknown 28983 1726883137.19942: calling self._execute() 28983 1726883137.20032: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.20039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.20051: variable 'omit' from source: magic vars 28983 1726883137.20384: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.20394: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.20402: variable 'omit' from source: magic vars 28983 1726883137.20444: variable 'omit' from source: magic vars 28983 1726883137.20544: variable '__rpm_q_networkmanager' from source: set_fact 28983 1726883137.20567: variable 'omit' from source: magic vars 28983 1726883137.20610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883137.20642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883137.20660: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883137.20680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.20692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.20720: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883137.20724: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.20729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.20817: Set connection var ansible_connection to ssh 28983 1726883137.20827: Set connection var ansible_shell_executable to /bin/sh 28983 1726883137.20838: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883137.20846: Set connection var ansible_timeout to 10 28983 1726883137.20854: Set connection var ansible_pipelining to False 28983 1726883137.20857: Set connection var ansible_shell_type to sh 28983 1726883137.20879: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.20884: variable 'ansible_connection' from source: unknown 28983 1726883137.20886: variable 'ansible_module_compression' from source: unknown 28983 1726883137.20888: variable 'ansible_shell_type' from source: unknown 28983 1726883137.20893: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.20897: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.20901: variable 'ansible_pipelining' from source: unknown 28983 1726883137.20911: variable 'ansible_timeout' from source: unknown 28983 1726883137.20913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.21031: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883137.21042: variable 'omit' from source: magic vars 28983 1726883137.21047: starting attempt loop 28983 1726883137.21050: running the handler 28983 1726883137.21064: handler run complete 28983 1726883137.21073: attempt loop complete, returning result 28983 1726883137.21079: _execute() done 28983 1726883137.21082: dumping result to json 28983 1726883137.21087: done dumping result, returning 28983 1726883137.21094: done running TaskExecutor() for managed_node2/TASK: Store NetworkManager version [0affe814-3a2d-b16d-c0a7-00000000280a] 28983 1726883137.21099: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000280a 28983 1726883137.21191: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000280a 28983 1726883137.21194: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.44.2-1.fc39" }, "changed": false } 28983 1726883137.21254: no more pending results, returning what we have 28983 1726883137.21258: results queue empty 28983 1726883137.21259: checking for any_errors_fatal 28983 1726883137.21270: done checking for any_errors_fatal 28983 1726883137.21270: checking for max_fail_percentage 28983 1726883137.21272: done checking for max_fail_percentage 28983 1726883137.21273: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.21274: done checking to see if all hosts have failed 28983 1726883137.21275: getting the remaining hosts for this loop 28983 1726883137.21277: done getting the remaining hosts for this loop 28983 1726883137.21282: getting the next task for host managed_node2 28983 1726883137.21290: done getting next task for host managed_node2 28983 1726883137.21292: ^ task is: TASK: Show NetworkManager version 28983 1726883137.21297: ^ state is: HOST STATE: block=8, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.21300: getting variables 28983 1726883137.21302: in VariableManager get_vars() 28983 1726883137.21350: Calling all_inventory to load vars for managed_node2 28983 1726883137.21354: Calling groups_inventory to load vars for managed_node2 28983 1726883137.21357: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.21366: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.21369: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.21373: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.22706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.24306: done with get_vars() 28983 1726883137.24328: done getting variables 28983 1726883137.24374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] 
********************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Friday 20 September 2024 21:45:37 -0400 (0:00:00.051) 0:02:47.241 ****** 28983 1726883137.24403: entering _queue_task() for managed_node2/debug 28983 1726883137.24618: worker is 1 (out of 1 available) 28983 1726883137.24635: exiting _queue_task() for managed_node2/debug 28983 1726883137.24648: done queuing things up, now waiting for results queue to drain 28983 1726883137.24650: waiting for pending results... 28983 1726883137.24837: running TaskExecutor() for managed_node2/TASK: Show NetworkManager version 28983 1726883137.24944: in run() - task 0affe814-3a2d-b16d-c0a7-00000000280b 28983 1726883137.24958: variable 'ansible_search_path' from source: unknown 28983 1726883137.24962: variable 'ansible_search_path' from source: unknown 28983 1726883137.25000: calling self._execute() 28983 1726883137.25091: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.25096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.25109: variable 'omit' from source: magic vars 28983 1726883137.25426: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.25438: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.25448: variable 'omit' from source: magic vars 28983 1726883137.25489: variable 'omit' from source: magic vars 28983 1726883137.25518: variable 'omit' from source: magic vars 28983 1726883137.25557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883137.25590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883137.25609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883137.25625: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.25636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.25667: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883137.25670: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.25676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.25758: Set connection var ansible_connection to ssh 28983 1726883137.25774: Set connection var ansible_shell_executable to /bin/sh 28983 1726883137.25780: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883137.25788: Set connection var ansible_timeout to 10 28983 1726883137.25795: Set connection var ansible_pipelining to False 28983 1726883137.25797: Set connection var ansible_shell_type to sh 28983 1726883137.25816: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.25819: variable 'ansible_connection' from source: unknown 28983 1726883137.25822: variable 'ansible_module_compression' from source: unknown 28983 1726883137.25827: variable 'ansible_shell_type' from source: unknown 28983 1726883137.25829: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.25835: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.25841: variable 'ansible_pipelining' from source: unknown 28983 1726883137.25845: variable 'ansible_timeout' from source: unknown 28983 1726883137.25850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.25968: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883137.25985: variable 'omit' from source: magic vars 28983 1726883137.25990: starting attempt loop 28983 1726883137.25993: running the handler 28983 1726883137.26035: variable 'networkmanager_nvr' from source: set_fact 28983 1726883137.26106: variable 'networkmanager_nvr' from source: set_fact 28983 1726883137.26116: handler run complete 28983 1726883137.26131: attempt loop complete, returning result 28983 1726883137.26136: _execute() done 28983 1726883137.26139: dumping result to json 28983 1726883137.26144: done dumping result, returning 28983 1726883137.26152: done running TaskExecutor() for managed_node2/TASK: Show NetworkManager version [0affe814-3a2d-b16d-c0a7-00000000280b] 28983 1726883137.26157: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000280b 28983 1726883137.26249: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000280b 28983 1726883137.26252: WORKER PROCESS EXITING ok: [managed_node2] => { "networkmanager_nvr": "NetworkManager-1.44.2-1.fc39" } 28983 1726883137.26306: no more pending results, returning what we have 28983 1726883137.26310: results queue empty 28983 1726883137.26311: checking for any_errors_fatal 28983 1726883137.26316: done checking for any_errors_fatal 28983 1726883137.26317: checking for max_fail_percentage 28983 1726883137.26319: done checking for max_fail_percentage 28983 1726883137.26320: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.26321: done checking to see if all hosts have failed 28983 1726883137.26322: getting the remaining hosts for this loop 28983 1726883137.26324: done getting the remaining hosts for this loop 28983 1726883137.26328: getting the next task for host managed_node2 28983 1726883137.26343: done getting next task for host managed_node2 28983 
1726883137.26347: ^ task is: TASK: Conditional asserts 28983 1726883137.26351: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.26355: getting variables 28983 1726883137.26356: in VariableManager get_vars() 28983 1726883137.26402: Calling all_inventory to load vars for managed_node2 28983 1726883137.26405: Calling groups_inventory to load vars for managed_node2 28983 1726883137.26409: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.26418: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.26421: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.26425: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.27657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.29267: done with get_vars() 28983 1726883137.29294: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:45:37 -0400 (0:00:00.049) 0:02:47.291 ****** 28983 1726883137.29368: entering _queue_task() for managed_node2/include_tasks 28983 1726883137.29588: worker is 1 (out of 1 available) 28983 1726883137.29601: exiting _queue_task() for managed_node2/include_tasks 28983 1726883137.29615: done queuing 
things up, now waiting for results queue to drain 28983 1726883137.29617: waiting for pending results... 28983 1726883137.29798: running TaskExecutor() for managed_node2/TASK: Conditional asserts 28983 1726883137.29892: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020b3 28983 1726883137.29903: variable 'ansible_search_path' from source: unknown 28983 1726883137.29907: variable 'ansible_search_path' from source: unknown 28983 1726883137.30144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 28983 1726883137.32176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 28983 1726883137.32227: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 28983 1726883137.32261: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 28983 1726883137.32293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 28983 1726883137.32315: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 28983 1726883137.32386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 28983 1726883137.32410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 28983 1726883137.32432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 28983 1726883137.32471: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 28983 1726883137.32487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 28983 1726883137.32581: variable 'lsr_assert_when' from source: include params 28983 1726883137.32670: variable 'network_provider' from source: set_fact 28983 1726883137.32742: variable 'omit' from source: magic vars 28983 1726883137.32820: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.32829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.32841: variable 'omit' from source: magic vars 28983 1726883137.33009: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.33017: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.33112: variable 'item' from source: unknown 28983 1726883137.33119: Evaluated conditional (item['condition']): True 28983 1726883137.33188: variable 'item' from source: unknown 28983 1726883137.33229: variable 'item' from source: unknown 28983 1726883137.33276: variable 'item' from source: unknown 28983 1726883137.33436: dumping result to json 28983 1726883137.33439: done dumping result, returning 28983 1726883137.33442: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0affe814-3a2d-b16d-c0a7-0000000020b3] 28983 1726883137.33444: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b3 28983 1726883137.33496: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b3 28983 1726883137.33500: WORKER PROCESS EXITING 28983 1726883137.33575: no more pending results, returning what we have 28983 1726883137.33580: in VariableManager get_vars() 28983 1726883137.33622: 
Calling all_inventory to load vars for managed_node2 28983 1726883137.33625: Calling groups_inventory to load vars for managed_node2 28983 1726883137.33629: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.33641: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.33644: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.33647: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.35023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.36599: done with get_vars() 28983 1726883137.36622: variable 'ansible_search_path' from source: unknown 28983 1726883137.36624: variable 'ansible_search_path' from source: unknown 28983 1726883137.36656: we have included files to process 28983 1726883137.36657: generating all_blocks data 28983 1726883137.36659: done generating all_blocks data 28983 1726883137.36663: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883137.36664: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883137.36666: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 28983 1726883137.36754: in VariableManager get_vars() 28983 1726883137.36773: done with get_vars() 28983 1726883137.36865: done processing included file 28983 1726883137.36867: iterating over new_blocks loaded from include file 28983 1726883137.36868: in VariableManager get_vars() 28983 1726883137.36883: done with get_vars() 28983 1726883137.36885: filtering new block on tags 28983 1726883137.36911: done filtering new block on tags 28983 1726883137.36913: done iterating over new_blocks loaded from include file included: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item={'what': 'tasks/assert_device_absent.yml', 'condition': True}) 28983 1726883137.36918: extending task lists for all hosts with included blocks 28983 1726883137.38026: done extending task lists 28983 1726883137.38028: done processing included files 28983 1726883137.38028: results queue empty 28983 1726883137.38029: checking for any_errors_fatal 28983 1726883137.38031: done checking for any_errors_fatal 28983 1726883137.38032: checking for max_fail_percentage 28983 1726883137.38033: done checking for max_fail_percentage 28983 1726883137.38035: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.38036: done checking to see if all hosts have failed 28983 1726883137.38036: getting the remaining hosts for this loop 28983 1726883137.38038: done getting the remaining hosts for this loop 28983 1726883137.38040: getting the next task for host managed_node2 28983 1726883137.38043: done getting next task for host managed_node2 28983 1726883137.38045: ^ task is: TASK: Include the task 'get_interface_stat.yml' 28983 1726883137.38047: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.38053: getting variables 28983 1726883137.38054: in VariableManager get_vars() 28983 1726883137.38124: Calling all_inventory to load vars for managed_node2 28983 1726883137.38127: Calling groups_inventory to load vars for managed_node2 28983 1726883137.38129: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.38137: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.38139: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.38142: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.39261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.40856: done with get_vars() 28983 1726883137.40882: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:45:37 -0400 (0:00:00.115) 0:02:47.407 ****** 28983 1726883137.40950: entering _queue_task() for managed_node2/include_tasks 28983 1726883137.41243: worker is 1 (out of 1 available) 28983 1726883137.41258: exiting _queue_task() for managed_node2/include_tasks 28983 1726883137.41271: done queuing things up, now waiting for results queue to drain 28983 1726883137.41273: waiting for pending results... 
28983 1726883137.41480: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 28983 1726883137.41583: in run() - task 0affe814-3a2d-b16d-c0a7-0000000028d3 28983 1726883137.41596: variable 'ansible_search_path' from source: unknown 28983 1726883137.41599: variable 'ansible_search_path' from source: unknown 28983 1726883137.41640: calling self._execute() 28983 1726883137.41741: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.41747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.41758: variable 'omit' from source: magic vars 28983 1726883137.42104: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.42115: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.42121: _execute() done 28983 1726883137.42126: dumping result to json 28983 1726883137.42131: done dumping result, returning 28983 1726883137.42164: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affe814-3a2d-b16d-c0a7-0000000028d3] 28983 1726883137.42169: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000028d3 28983 1726883137.42242: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000028d3 28983 1726883137.42245: WORKER PROCESS EXITING 28983 1726883137.42294: no more pending results, returning what we have 28983 1726883137.42300: in VariableManager get_vars() 28983 1726883137.42365: Calling all_inventory to load vars for managed_node2 28983 1726883137.42368: Calling groups_inventory to load vars for managed_node2 28983 1726883137.42373: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.42386: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.42389: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.42392: Calling groups_plugins_play to load vars for managed_node2 28983 
1726883137.43702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.45411: done with get_vars() 28983 1726883137.45432: variable 'ansible_search_path' from source: unknown 28983 1726883137.45433: variable 'ansible_search_path' from source: unknown 28983 1726883137.45552: variable 'item' from source: include params 28983 1726883137.45583: we have included files to process 28983 1726883137.45584: generating all_blocks data 28983 1726883137.45586: done generating all_blocks data 28983 1726883137.45587: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883137.45588: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883137.45590: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 28983 1726883137.45745: done processing included file 28983 1726883137.45747: iterating over new_blocks loaded from include file 28983 1726883137.45748: in VariableManager get_vars() 28983 1726883137.45764: done with get_vars() 28983 1726883137.45765: filtering new block on tags 28983 1726883137.45786: done filtering new block on tags 28983 1726883137.45788: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 28983 1726883137.45792: extending task lists for all hosts with included blocks 28983 1726883137.45919: done extending task lists 28983 1726883137.45921: done processing included files 28983 1726883137.45921: results queue empty 28983 1726883137.45922: checking for any_errors_fatal 28983 1726883137.45926: done checking for any_errors_fatal 28983 1726883137.45926: checking for 
max_fail_percentage 28983 1726883137.45927: done checking for max_fail_percentage 28983 1726883137.45928: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.45929: done checking to see if all hosts have failed 28983 1726883137.45929: getting the remaining hosts for this loop 28983 1726883137.45930: done getting the remaining hosts for this loop 28983 1726883137.45932: getting the next task for host managed_node2 28983 1726883137.45938: done getting next task for host managed_node2 28983 1726883137.45940: ^ task is: TASK: Get stat for interface {{ interface }} 28983 1726883137.45942: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883137.45944: getting variables 28983 1726883137.45945: in VariableManager get_vars() 28983 1726883137.45955: Calling all_inventory to load vars for managed_node2 28983 1726883137.45956: Calling groups_inventory to load vars for managed_node2 28983 1726883137.45958: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.45963: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.45965: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.45967: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.47082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.48655: done with get_vars() 28983 1726883137.48679: done getting variables 28983 1726883137.48788: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:45:37 -0400 (0:00:00.078) 0:02:47.485 ****** 28983 1726883137.48814: entering _queue_task() for managed_node2/stat 28983 1726883137.49110: worker is 1 (out of 1 available) 28983 1726883137.49127: exiting _queue_task() for managed_node2/stat 28983 1726883137.49143: done queuing things up, now waiting for results queue to drain 28983 1726883137.49145: waiting for pending results... 
28983 1726883137.49370: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 28983 1726883137.49481: in run() - task 0affe814-3a2d-b16d-c0a7-000000002979 28983 1726883137.49498: variable 'ansible_search_path' from source: unknown 28983 1726883137.49502: variable 'ansible_search_path' from source: unknown 28983 1726883137.49531: calling self._execute() 28983 1726883137.49624: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.49631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.49645: variable 'omit' from source: magic vars 28983 1726883137.49971: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.49983: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.49991: variable 'omit' from source: magic vars 28983 1726883137.50050: variable 'omit' from source: magic vars 28983 1726883137.50137: variable 'interface' from source: play vars 28983 1726883137.50157: variable 'omit' from source: magic vars 28983 1726883137.50200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883137.50232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883137.50256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883137.50277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.50287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.50315: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883137.50319: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.50324: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.50410: Set connection var ansible_connection to ssh 28983 1726883137.50421: Set connection var ansible_shell_executable to /bin/sh 28983 1726883137.50429: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883137.50440: Set connection var ansible_timeout to 10 28983 1726883137.50446: Set connection var ansible_pipelining to False 28983 1726883137.50449: Set connection var ansible_shell_type to sh 28983 1726883137.50469: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.50476: variable 'ansible_connection' from source: unknown 28983 1726883137.50479: variable 'ansible_module_compression' from source: unknown 28983 1726883137.50482: variable 'ansible_shell_type' from source: unknown 28983 1726883137.50485: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.50489: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.50491: variable 'ansible_pipelining' from source: unknown 28983 1726883137.50502: variable 'ansible_timeout' from source: unknown 28983 1726883137.50504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.50678: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 28983 1726883137.50686: variable 'omit' from source: magic vars 28983 1726883137.50692: starting attempt loop 28983 1726883137.50695: running the handler 28983 1726883137.50711: _low_level_execute_command(): starting 28983 1726883137.50722: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883137.51286: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883137.51290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.51294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883137.51297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.51340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883137.51344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883137.51359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883137.51431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.53196: stdout chunk (state=3): >>>/root <<< 28983 1726883137.53308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883137.53369: stderr chunk (state=3): >>><<< 28983 1726883137.53375: stdout chunk (state=3): >>><<< 28983 1726883137.53396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883137.53409: _low_level_execute_command(): starting 28983 1726883137.53416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265 `" && echo ansible-tmp-1726883137.5339699-34781-90176789776265="` echo /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265 `" ) && sleep 0' 28983 1726883137.53902: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883137.53905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.53908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address 
debug1: re-parsing configuration <<< 28983 1726883137.53919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883137.53922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.53966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883137.53970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883137.54051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.56037: stdout chunk (state=3): >>>ansible-tmp-1726883137.5339699-34781-90176789776265=/root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265 <<< 28983 1726883137.56152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883137.56203: stderr chunk (state=3): >>><<< 28983 1726883137.56206: stdout chunk (state=3): >>><<< 28983 1726883137.56220: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883137.5339699-34781-90176789776265=/root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883137.56262: variable 'ansible_module_compression' from source: unknown 28983 1726883137.56314: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 28983 1726883137.56348: variable 'ansible_facts' from source: unknown 28983 1726883137.56416: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py 28983 1726883137.56526: Sending initial data 28983 1726883137.56530: Sent initial data (152 bytes) 28983 1726883137.56987: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883137.56990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883137.56992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883137.56996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.57056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883137.57059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883137.57125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.58756: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 28983 1726883137.58761: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883137.58827: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883137.58893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpjwifjqpi /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py <<< 28983 1726883137.58897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py" <<< 28983 1726883137.58958: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpjwifjqpi" to remote "/root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py" <<< 28983 1726883137.59865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883137.59922: stderr chunk (state=3): >>><<< 28983 1726883137.59926: stdout chunk (state=3): >>><<< 28983 1726883137.59947: done transferring module to remote 28983 1726883137.59958: _low_level_execute_command(): starting 28983 1726883137.59961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/ /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py && sleep 0' 28983 1726883137.60411: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883137.60414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883137.60416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.60419: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883137.60421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.60483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883137.60490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883137.60560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.62419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883137.62462: stderr chunk (state=3): >>><<< 28983 1726883137.62465: stdout chunk (state=3): >>><<< 28983 1726883137.62483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883137.62488: _low_level_execute_command(): starting 28983 1726883137.62491: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/AnsiballZ_stat.py && sleep 0' 28983 1726883137.62920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883137.62923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.62927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883137.62930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.62986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883137.62991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
28983 1726883137.63066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.80352: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 28983 1726883137.81820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. <<< 28983 1726883137.81887: stderr chunk (state=3): >>><<< 28983 1726883137.81890: stdout chunk (state=3): >>><<< 28983 1726883137.81908: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883137.81941: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883137.81952: _low_level_execute_command(): starting 28983 1726883137.81956: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883137.5339699-34781-90176789776265/ > /dev/null 2>&1 && sleep 0' 28983 1726883137.82451: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883137.82455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883137.82457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883137.82460: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 28983 1726883137.82462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883137.82518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883137.82525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883137.82595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883137.84541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883137.84594: stderr chunk (state=3): >>><<< 28983 1726883137.84597: stdout chunk (state=3): >>><<< 28983 1726883137.84613: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883137.84621: handler run complete 28983 1726883137.84647: attempt loop complete, returning result 28983 1726883137.84650: _execute() done 28983 1726883137.84653: dumping result to json 28983 1726883137.84658: done dumping result, returning 28983 1726883137.84670: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0affe814-3a2d-b16d-c0a7-000000002979] 28983 1726883137.84740: sending task result for task 0affe814-3a2d-b16d-c0a7-000000002979 28983 1726883137.84813: done sending task result for task 0affe814-3a2d-b16d-c0a7-000000002979 28983 1726883137.84816: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 28983 1726883137.84895: no more pending results, returning what we have 28983 1726883137.84899: results queue empty 28983 1726883137.84900: checking for any_errors_fatal 28983 1726883137.84903: done checking for any_errors_fatal 28983 1726883137.84903: checking for max_fail_percentage 28983 1726883137.84905: done checking for max_fail_percentage 28983 1726883137.84907: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.84907: done checking to see if all hosts have failed 28983 1726883137.84908: getting the remaining hosts for this loop 28983 1726883137.84910: done getting the remaining hosts for this loop 28983 1726883137.84915: getting the next task for host managed_node2 28983 1726883137.84928: done getting next task for host managed_node2 28983 1726883137.84931: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 28983 1726883137.84938: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.84943: getting variables 28983 1726883137.84945: in VariableManager get_vars() 28983 1726883137.84995: Calling all_inventory to load vars for managed_node2 28983 1726883137.84998: Calling groups_inventory to load vars for managed_node2 28983 1726883137.85002: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.85013: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.85016: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.85019: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.86474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.88077: done with get_vars() 28983 1726883137.88106: done getting variables 28983 1726883137.88160: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883137.88264: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:45:37 -0400 (0:00:00.394) 0:02:47.880 ****** 28983 1726883137.88294: entering _queue_task() for managed_node2/assert 28983 1726883137.88583: worker is 1 (out of 1 available) 28983 1726883137.88599: exiting _queue_task() for managed_node2/assert 28983 1726883137.88613: done queuing things up, now waiting for results queue to drain 28983 1726883137.88621: waiting for pending results... 28983 1726883137.88835: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 28983 1726883137.88943: in run() - task 0affe814-3a2d-b16d-c0a7-0000000028d4 28983 1726883137.88957: variable 'ansible_search_path' from source: unknown 28983 1726883137.88963: variable 'ansible_search_path' from source: unknown 28983 1726883137.88999: calling self._execute() 28983 1726883137.89090: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.89096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.89107: variable 'omit' from source: magic vars 28983 1726883137.89441: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.89452: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.89459: variable 'omit' from source: magic vars 28983 1726883137.89500: variable 'omit' from source: magic vars 28983 1726883137.89584: variable 'interface' from source: play vars 28983 1726883137.89601: variable 'omit' from source: magic vars 28983 1726883137.89645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883137.89681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883137.89701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
28983 1726883137.89717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.89727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.89760: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883137.89764: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.89767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.89855: Set connection var ansible_connection to ssh 28983 1726883137.89864: Set connection var ansible_shell_executable to /bin/sh 28983 1726883137.89873: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883137.89883: Set connection var ansible_timeout to 10 28983 1726883137.89890: Set connection var ansible_pipelining to False 28983 1726883137.89893: Set connection var ansible_shell_type to sh 28983 1726883137.89913: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.89917: variable 'ansible_connection' from source: unknown 28983 1726883137.89919: variable 'ansible_module_compression' from source: unknown 28983 1726883137.89924: variable 'ansible_shell_type' from source: unknown 28983 1726883137.89926: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.89931: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.89937: variable 'ansible_pipelining' from source: unknown 28983 1726883137.89941: variable 'ansible_timeout' from source: unknown 28983 1726883137.89949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.90069: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883137.90085: variable 'omit' from source: magic vars 28983 1726883137.90091: starting attempt loop 28983 1726883137.90094: running the handler 28983 1726883137.90223: variable 'interface_stat' from source: set_fact 28983 1726883137.90232: Evaluated conditional (not interface_stat.stat.exists): True 28983 1726883137.90241: handler run complete 28983 1726883137.90254: attempt loop complete, returning result 28983 1726883137.90257: _execute() done 28983 1726883137.90260: dumping result to json 28983 1726883137.90265: done dumping result, returning 28983 1726883137.90275: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0affe814-3a2d-b16d-c0a7-0000000028d4] 28983 1726883137.90281: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000028d4 28983 1726883137.90374: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000028d4 28983 1726883137.90377: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 28983 1726883137.90433: no more pending results, returning what we have 28983 1726883137.90439: results queue empty 28983 1726883137.90440: checking for any_errors_fatal 28983 1726883137.90454: done checking for any_errors_fatal 28983 1726883137.90455: checking for max_fail_percentage 28983 1726883137.90457: done checking for max_fail_percentage 28983 1726883137.90458: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.90459: done checking to see if all hosts have failed 28983 1726883137.90460: getting the remaining hosts for this loop 28983 1726883137.90462: done getting the remaining hosts for this loop 28983 1726883137.90467: getting the next task for host managed_node2 28983 1726883137.90478: done getting next task for host managed_node2 
28983 1726883137.90482: ^ task is: TASK: Success in test '{{ lsr_description }}' 28983 1726883137.90485: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.90490: getting variables 28983 1726883137.90491: in VariableManager get_vars() 28983 1726883137.90549: Calling all_inventory to load vars for managed_node2 28983 1726883137.90552: Calling groups_inventory to load vars for managed_node2 28983 1726883137.90556: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.90567: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.90571: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.90575: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.91873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.93583: done with get_vars() 28983 1726883137.93608: done getting variables 28983 1726883137.93660: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 28983 1726883137.93758: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task 
path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:45:37 -0400 (0:00:00.054) 0:02:47.935 ****** 28983 1726883137.93786: entering _queue_task() for managed_node2/debug 28983 1726883137.94071: worker is 1 (out of 1 available) 28983 1726883137.94087: exiting _queue_task() for managed_node2/debug 28983 1726883137.94100: done queuing things up, now waiting for results queue to drain 28983 1726883137.94102: waiting for pending results... 28983 1726883137.94318: running TaskExecutor() for managed_node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 28983 1726883137.94412: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020b4 28983 1726883137.94424: variable 'ansible_search_path' from source: unknown 28983 1726883137.94429: variable 'ansible_search_path' from source: unknown 28983 1726883137.94466: calling self._execute() 28983 1726883137.94558: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.94565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.94579: variable 'omit' from source: magic vars 28983 1726883137.94910: variable 'ansible_distribution_major_version' from source: facts 28983 1726883137.94921: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883137.94929: variable 'omit' from source: magic vars 28983 1726883137.94963: variable 'omit' from source: magic vars 28983 1726883137.95051: variable 'lsr_description' from source: include params 28983 1726883137.95067: variable 'omit' from source: magic vars 28983 1726883137.95108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883137.95146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883137.95163: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883137.95182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.95195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883137.95223: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883137.95227: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.95231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.95319: Set connection var ansible_connection to ssh 28983 1726883137.95330: Set connection var ansible_shell_executable to /bin/sh 28983 1726883137.95341: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883137.95362: Set connection var ansible_timeout to 10 28983 1726883137.95365: Set connection var ansible_pipelining to False 28983 1726883137.95368: Set connection var ansible_shell_type to sh 28983 1726883137.95384: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.95387: variable 'ansible_connection' from source: unknown 28983 1726883137.95390: variable 'ansible_module_compression' from source: unknown 28983 1726883137.95394: variable 'ansible_shell_type' from source: unknown 28983 1726883137.95397: variable 'ansible_shell_executable' from source: unknown 28983 1726883137.95402: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.95409: variable 'ansible_pipelining' from source: unknown 28983 1726883137.95412: variable 'ansible_timeout' from source: unknown 28983 1726883137.95414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.95539: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883137.95551: variable 'omit' from source: magic vars 28983 1726883137.95557: starting attempt loop 28983 1726883137.95560: running the handler 28983 1726883137.95608: handler run complete 28983 1726883137.95621: attempt loop complete, returning result 28983 1726883137.95625: _execute() done 28983 1726883137.95628: dumping result to json 28983 1726883137.95630: done dumping result, returning 28983 1726883137.95644: done running TaskExecutor() for managed_node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [0affe814-3a2d-b16d-c0a7-0000000020b4] 28983 1726883137.95649: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b4 ok: [managed_node2] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 28983 1726883137.95796: no more pending results, returning what we have 28983 1726883137.95800: results queue empty 28983 1726883137.95801: checking for any_errors_fatal 28983 1726883137.95811: done checking for any_errors_fatal 28983 1726883137.95812: checking for max_fail_percentage 28983 1726883137.95814: done checking for max_fail_percentage 28983 1726883137.95815: checking to see if all hosts have failed and the running result is not ok 28983 1726883137.95816: done checking to see if all hosts have failed 28983 1726883137.95817: getting the remaining hosts for this loop 28983 1726883137.95819: done getting the remaining hosts for this loop 28983 1726883137.95824: getting the next task for host managed_node2 28983 1726883137.95836: done getting next task for host managed_node2 28983 1726883137.95840: ^ task is: TASK: Cleanup 28983 1726883137.95844: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883137.95852: getting variables 28983 1726883137.95853: in VariableManager get_vars() 28983 1726883137.95902: Calling all_inventory to load vars for managed_node2 28983 1726883137.95906: Calling groups_inventory to load vars for managed_node2 28983 1726883137.95909: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883137.95920: Calling all_plugins_play to load vars for managed_node2 28983 1726883137.95924: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883137.95927: Calling groups_plugins_play to load vars for managed_node2 28983 1726883137.95946: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b4 28983 1726883137.95949: WORKER PROCESS EXITING 28983 1726883137.97210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883137.98805: done with get_vars() 28983 1726883137.98827: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:45:37 -0400 (0:00:00.051) 0:02:47.986 ****** 28983 1726883137.98906: entering _queue_task() for managed_node2/include_tasks 28983 1726883137.99152: worker is 1 (out of 1 available) 28983 1726883137.99166: exiting _queue_task() for managed_node2/include_tasks 28983 1726883137.99182: done queuing things up, now waiting for results queue to drain 28983 
1726883137.99184: waiting for pending results... 28983 1726883137.99386: running TaskExecutor() for managed_node2/TASK: Cleanup 28983 1726883137.99474: in run() - task 0affe814-3a2d-b16d-c0a7-0000000020b8 28983 1726883137.99490: variable 'ansible_search_path' from source: unknown 28983 1726883137.99493: variable 'ansible_search_path' from source: unknown 28983 1726883137.99533: variable 'lsr_cleanup' from source: include params 28983 1726883137.99717: variable 'lsr_cleanup' from source: include params 28983 1726883137.99783: variable 'omit' from source: magic vars 28983 1726883137.99902: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883137.99912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883137.99923: variable 'omit' from source: magic vars 28983 1726883138.00132: variable 'ansible_distribution_major_version' from source: facts 28983 1726883138.00142: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883138.00150: variable 'item' from source: unknown 28983 1726883138.00208: variable 'item' from source: unknown 28983 1726883138.00237: variable 'item' from source: unknown 28983 1726883138.00303: variable 'item' from source: unknown 28983 1726883138.00428: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.00432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.00436: variable 'omit' from source: magic vars 28983 1726883138.00558: variable 'ansible_distribution_major_version' from source: facts 28983 1726883138.00562: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883138.00566: variable 'item' from source: unknown 28983 1726883138.00614: variable 'item' from source: unknown 28983 1726883138.00640: variable 'item' from source: unknown 28983 1726883138.00693: variable 'item' from source: unknown 28983 1726883138.00765: dumping result to json 28983 
1726883138.00769: done dumping result, returning 28983 1726883138.00775: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affe814-3a2d-b16d-c0a7-0000000020b8] 28983 1726883138.00777: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b8 28983 1726883138.00815: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000020b8 28983 1726883138.00818: WORKER PROCESS EXITING 28983 1726883138.00852: no more pending results, returning what we have 28983 1726883138.00858: in VariableManager get_vars() 28983 1726883138.00917: Calling all_inventory to load vars for managed_node2 28983 1726883138.00920: Calling groups_inventory to load vars for managed_node2 28983 1726883138.00924: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883138.00938: Calling all_plugins_play to load vars for managed_node2 28983 1726883138.00942: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883138.00946: Calling groups_plugins_play to load vars for managed_node2 28983 1726883138.02363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883138.03925: done with get_vars() 28983 1726883138.03946: variable 'ansible_search_path' from source: unknown 28983 1726883138.03947: variable 'ansible_search_path' from source: unknown 28983 1726883138.03981: variable 'ansible_search_path' from source: unknown 28983 1726883138.03982: variable 'ansible_search_path' from source: unknown 28983 1726883138.04003: we have included files to process 28983 1726883138.04004: generating all_blocks data 28983 1726883138.04006: done generating all_blocks data 28983 1726883138.04010: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883138.04010: loading included file: 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883138.04012: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 28983 1726883138.04165: done processing included file 28983 1726883138.04167: iterating over new_blocks loaded from include file 28983 1726883138.04168: in VariableManager get_vars() 28983 1726883138.04184: done with get_vars() 28983 1726883138.04186: filtering new block on tags 28983 1726883138.04206: done filtering new block on tags 28983 1726883138.04208: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 28983 1726883138.04211: processing included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28983 1726883138.04212: loading included file: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28983 1726883138.04215: Loading data from /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 28983 1726883138.04497: done processing included file 28983 1726883138.04498: iterating over new_blocks loaded from include file 28983 1726883138.04499: in VariableManager get_vars() 28983 1726883138.04515: done with get_vars() 28983 1726883138.04516: filtering new block on tags 28983 1726883138.04542: done filtering new block on tags 28983 1726883138.04543: done iterating over new_blocks loaded from include file included: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 => (item=tasks/check_network_dns.yml) 
28983 1726883138.04546: extending task lists for all hosts with included blocks 28983 1726883138.06198: done extending task lists 28983 1726883138.06200: done processing included files 28983 1726883138.06201: results queue empty 28983 1726883138.06202: checking for any_errors_fatal 28983 1726883138.06205: done checking for any_errors_fatal 28983 1726883138.06206: checking for max_fail_percentage 28983 1726883138.06208: done checking for max_fail_percentage 28983 1726883138.06209: checking to see if all hosts have failed and the running result is not ok 28983 1726883138.06210: done checking to see if all hosts have failed 28983 1726883138.06211: getting the remaining hosts for this loop 28983 1726883138.06212: done getting the remaining hosts for this loop 28983 1726883138.06216: getting the next task for host managed_node2 28983 1726883138.06221: done getting next task for host managed_node2 28983 1726883138.06224: ^ task is: TASK: Cleanup profile and device 28983 1726883138.06228: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883138.06231: getting variables 28983 1726883138.06232: in VariableManager get_vars() 28983 1726883138.06249: Calling all_inventory to load vars for managed_node2 28983 1726883138.06258: Calling groups_inventory to load vars for managed_node2 28983 1726883138.06263: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883138.06269: Calling all_plugins_play to load vars for managed_node2 28983 1726883138.06273: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883138.06277: Calling groups_plugins_play to load vars for managed_node2 28983 1726883138.07686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883138.09314: done with get_vars() 28983 1726883138.09338: done getting variables 28983 1726883138.09370: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:45:38 -0400 (0:00:00.104) 0:02:48.091 ****** 28983 1726883138.09395: entering _queue_task() for managed_node2/shell 28983 1726883138.09647: worker is 1 (out of 1 available) 28983 1726883138.09663: exiting _queue_task() for managed_node2/shell 28983 1726883138.09677: done queuing things up, now waiting for results queue to drain 28983 1726883138.09679: waiting for pending results... 
28983 1726883138.09880: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 28983 1726883138.09975: in run() - task 0affe814-3a2d-b16d-c0a7-00000000299e 28983 1726883138.09991: variable 'ansible_search_path' from source: unknown 28983 1726883138.09995: variable 'ansible_search_path' from source: unknown 28983 1726883138.10028: calling self._execute() 28983 1726883138.10119: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.10124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.10139: variable 'omit' from source: magic vars 28983 1726883138.10467: variable 'ansible_distribution_major_version' from source: facts 28983 1726883138.10480: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883138.10488: variable 'omit' from source: magic vars 28983 1726883138.10527: variable 'omit' from source: magic vars 28983 1726883138.10655: variable 'interface' from source: play vars 28983 1726883138.10673: variable 'omit' from source: magic vars 28983 1726883138.10713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883138.10746: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883138.10766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883138.10787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883138.10798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883138.10825: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883138.10828: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.10835: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.10920: Set connection var ansible_connection to ssh 28983 1726883138.10931: Set connection var ansible_shell_executable to /bin/sh 28983 1726883138.10942: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883138.10951: Set connection var ansible_timeout to 10 28983 1726883138.10957: Set connection var ansible_pipelining to False 28983 1726883138.10959: Set connection var ansible_shell_type to sh 28983 1726883138.10981: variable 'ansible_shell_executable' from source: unknown 28983 1726883138.10985: variable 'ansible_connection' from source: unknown 28983 1726883138.10987: variable 'ansible_module_compression' from source: unknown 28983 1726883138.10990: variable 'ansible_shell_type' from source: unknown 28983 1726883138.10994: variable 'ansible_shell_executable' from source: unknown 28983 1726883138.10998: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.11003: variable 'ansible_pipelining' from source: unknown 28983 1726883138.11012: variable 'ansible_timeout' from source: unknown 28983 1726883138.11015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.11130: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883138.11143: variable 'omit' from source: magic vars 28983 1726883138.11148: starting attempt loop 28983 1726883138.11151: running the handler 28983 1726883138.11162: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883138.11184: _low_level_execute_command(): starting 28983 1726883138.11192: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883138.11745: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.11749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.11752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.11754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.11815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.11823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883138.11825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.11899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.13668: stdout chunk (state=3): >>>/root <<< 28983 1726883138.13779: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 28983 1726883138.13830: stderr chunk (state=3): >>><<< 28983 1726883138.13835: stdout chunk (state=3): >>><<< 28983 1726883138.13858: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.13871: _low_level_execute_command(): starting 28983 1726883138.13879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402 `" && echo ansible-tmp-1726883138.138571-34795-141567129978402="` echo /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402 `" ) && sleep 0' 28983 1726883138.14314: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.14317: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883138.14327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.14329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.14390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.14393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.14463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.16465: stdout chunk (state=3): >>>ansible-tmp-1726883138.138571-34795-141567129978402=/root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402 <<< 28983 1726883138.16569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.16624: stderr chunk (state=3): >>><<< 28983 1726883138.16628: stdout chunk (state=3): >>><<< 28983 1726883138.16651: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883138.138571-34795-141567129978402=/root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.16679: variable 'ansible_module_compression' from source: unknown 28983 1726883138.16725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883138.16762: variable 'ansible_facts' from source: unknown 28983 1726883138.16828: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py 28983 1726883138.16951: Sending initial data 28983 1726883138.16954: Sent initial data (155 bytes) 28983 1726883138.17442: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.17446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.17448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.17451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883138.17455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.17504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.17508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.17583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.19204: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 28983 1726883138.19207: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 
1726883138.19271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 28983 1726883138.19339: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp82ao21t0 /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py <<< 28983 1726883138.19342: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py" <<< 28983 1726883138.19404: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmp82ao21t0" to remote "/root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py" <<< 28983 1726883138.20308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.20369: stderr chunk (state=3): >>><<< 28983 1726883138.20373: stdout chunk (state=3): >>><<< 28983 1726883138.20399: done transferring module to remote 28983 1726883138.20409: _low_level_execute_command(): starting 28983 1726883138.20415: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/ /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py && sleep 0' 28983 1726883138.20874: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.20878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.20881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.20884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883138.20889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.20935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.20945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.21011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.22908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.22952: stderr chunk (state=3): >>><<< 28983 1726883138.22955: stdout chunk (state=3): >>><<< 28983 1726883138.22969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.22972: _low_level_execute_command(): starting 28983 1726883138.22980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/AnsiballZ_command.py && sleep 0' 28983 1726883138.23424: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.23427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.23430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883138.23432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883138.23436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.23488: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.23492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.23569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.44835: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:45:38.410732", "end": "2024-09-20 21:45:38.447121", "delta": "0:00:00.036389", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883138.46545: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883138.46612: stderr chunk (state=3): >>><<< 28983 1726883138.46616: stdout chunk (state=3): >>><<< 28983 1726883138.46638: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCould not load file '/etc/sysconfig/network-scripts/ifcfg-statebr'\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:45:38.410732", "end": "2024-09-20 21:45:38.447121", "delta": "0:00:00.036389", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.46.139 closed. 28983 1726883138.46677: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883138.46687: _low_level_execute_command(): starting 28983 1726883138.46695: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883138.138571-34795-141567129978402/ > /dev/null 2>&1 && sleep 0' 28983 1726883138.47191: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.47194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.47197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883138.47199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.47201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.47263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.47266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.47331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.49315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.49363: stderr chunk (state=3): >>><<< 28983 1726883138.49367: stdout chunk (state=3): >>><<< 28983 1726883138.49381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.49391: handler run complete 28983 1726883138.49414: Evaluated conditional (False): False 28983 1726883138.49424: attempt loop complete, returning result 28983 1726883138.49427: _execute() done 28983 1726883138.49432: dumping result to json 28983 1726883138.49441: done dumping result, returning 28983 1726883138.49450: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0affe814-3a2d-b16d-c0a7-00000000299e] 28983 1726883138.49454: sending task result for task 0affe814-3a2d-b16d-c0a7-00000000299e 28983 1726883138.49568: done sending task result for task 0affe814-3a2d-b16d-c0a7-00000000299e 28983 1726883138.49571: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.036389", "end": "2024-09-20 21:45:38.447121", "rc": 1, "start": "2024-09-20 21:45:38.410732" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. 
Could not load file '/etc/sysconfig/network-scripts/ifcfg-statebr' Cannot find device "statebr" MSG: non-zero return code ...ignoring 28983 1726883138.49655: no more pending results, returning what we have 28983 1726883138.49661: results queue empty 28983 1726883138.49662: checking for any_errors_fatal 28983 1726883138.49664: done checking for any_errors_fatal 28983 1726883138.49665: checking for max_fail_percentage 28983 1726883138.49667: done checking for max_fail_percentage 28983 1726883138.49669: checking to see if all hosts have failed and the running result is not ok 28983 1726883138.49669: done checking to see if all hosts have failed 28983 1726883138.49670: getting the remaining hosts for this loop 28983 1726883138.49675: done getting the remaining hosts for this loop 28983 1726883138.49680: getting the next task for host managed_node2 28983 1726883138.49700: done getting next task for host managed_node2 28983 1726883138.49703: ^ task is: TASK: Check routes and DNS 28983 1726883138.49708: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883138.49712: getting variables 28983 1726883138.49713: in VariableManager get_vars() 28983 1726883138.49763: Calling all_inventory to load vars for managed_node2 28983 1726883138.49766: Calling groups_inventory to load vars for managed_node2 28983 1726883138.49770: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883138.49784: Calling all_plugins_play to load vars for managed_node2 28983 1726883138.49787: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883138.49798: Calling groups_plugins_play to load vars for managed_node2 28983 1726883138.51117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883138.52727: done with get_vars() 28983 1726883138.52754: done getting variables 28983 1726883138.52806: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:45:38 -0400 (0:00:00.434) 0:02:48.526 ****** 28983 1726883138.52833: entering _queue_task() for managed_node2/shell 28983 1726883138.53112: worker is 1 (out of 1 available) 28983 1726883138.53127: exiting _queue_task() for managed_node2/shell 28983 1726883138.53142: done queuing things up, now waiting for results queue to drain 28983 1726883138.53144: waiting for pending results... 
28983 1726883138.53355: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 28983 1726883138.53452: in run() - task 0affe814-3a2d-b16d-c0a7-0000000029a2 28983 1726883138.53468: variable 'ansible_search_path' from source: unknown 28983 1726883138.53474: variable 'ansible_search_path' from source: unknown 28983 1726883138.53506: calling self._execute() 28983 1726883138.53595: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.53607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.53619: variable 'omit' from source: magic vars 28983 1726883138.53962: variable 'ansible_distribution_major_version' from source: facts 28983 1726883138.53974: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883138.53983: variable 'omit' from source: magic vars 28983 1726883138.54018: variable 'omit' from source: magic vars 28983 1726883138.54053: variable 'omit' from source: magic vars 28983 1726883138.54093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 28983 1726883138.54124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 28983 1726883138.54152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 28983 1726883138.54165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883138.54178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 28983 1726883138.54207: variable 'inventory_hostname' from source: host vars for 'managed_node2' 28983 1726883138.54210: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.54215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.54306: 
Set connection var ansible_connection to ssh 28983 1726883138.54316: Set connection var ansible_shell_executable to /bin/sh 28983 1726883138.54325: Set connection var ansible_module_compression to ZIP_DEFLATED 28983 1726883138.54335: Set connection var ansible_timeout to 10 28983 1726883138.54341: Set connection var ansible_pipelining to False 28983 1726883138.54344: Set connection var ansible_shell_type to sh 28983 1726883138.54367: variable 'ansible_shell_executable' from source: unknown 28983 1726883138.54371: variable 'ansible_connection' from source: unknown 28983 1726883138.54374: variable 'ansible_module_compression' from source: unknown 28983 1726883138.54382: variable 'ansible_shell_type' from source: unknown 28983 1726883138.54384: variable 'ansible_shell_executable' from source: unknown 28983 1726883138.54387: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.54392: variable 'ansible_pipelining' from source: unknown 28983 1726883138.54396: variable 'ansible_timeout' from source: unknown 28983 1726883138.54401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.54525: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883138.54537: variable 'omit' from source: magic vars 28983 1726883138.54544: starting attempt loop 28983 1726883138.54546: running the handler 28983 1726883138.54558: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 28983 1726883138.54581: 
_low_level_execute_command(): starting 28983 1726883138.54589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 28983 1726883138.55138: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.55142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.55146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883138.55150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.55199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.55203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.55289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.57092: stdout chunk (state=3): >>>/root <<< 28983 1726883138.57197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.57255: stderr chunk (state=3): >>><<< 28983 1726883138.57259: stdout chunk (state=3): >>><<< 28983 1726883138.57284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 
24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.57299: _low_level_execute_command(): starting 28983 1726883138.57305: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161 `" && echo ansible-tmp-1726883138.5728471-34803-39451072104161="` echo /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161 `" ) && sleep 0' 28983 1726883138.57767: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.57770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.57781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.57783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.57832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.57841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.57916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.60024: stdout chunk (state=3): >>>ansible-tmp-1726883138.5728471-34803-39451072104161=/root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161 <<< 28983 1726883138.60138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.60191: stderr chunk (state=3): >>><<< 28983 1726883138.60194: stdout chunk (state=3): >>><<< 28983 1726883138.60209: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883138.5728471-34803-39451072104161=/root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.60238: variable 'ansible_module_compression' from source: unknown 28983 1726883138.60288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-28983n8o6vhn0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 28983 1726883138.60318: variable 'ansible_facts' from source: unknown 28983 1726883138.60386: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py 28983 1726883138.60496: Sending initial data 28983 1726883138.60499: Sent initial data (155 bytes) 28983 1726883138.60965: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.60968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883138.60971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 
is address debug1: re-parsing configuration <<< 28983 1726883138.60977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.61030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.61036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.61109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.62827: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 28983 1726883138.62832: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 28983 1726883138.62894: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 28983 1726883138.62961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpn41hmp63 /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py <<< 28983 1726883138.62971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py" <<< 28983 1726883138.63028: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-28983n8o6vhn0/tmpn41hmp63" to remote "/root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py" <<< 28983 1726883138.63935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.63993: stderr chunk (state=3): >>><<< 28983 1726883138.63997: stdout chunk (state=3): >>><<< 28983 1726883138.64015: done transferring module to remote 28983 1726883138.64027: _low_level_execute_command(): starting 28983 1726883138.64033: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/ /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py && sleep 0' 28983 1726883138.64494: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.64497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883138.64499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.64502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883138.64507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.64556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.64560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.64638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.66631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.66683: stderr chunk (state=3): >>><<< 28983 1726883138.66686: stdout chunk (state=3): >>><<< 28983 1726883138.66700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.66704: _low_level_execute_command(): starting 28983 1726883138.66710: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/AnsiballZ_command.py && sleep 0' 28983 1726883138.67167: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.67170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found <<< 28983 1726883138.67172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration <<< 28983 1726883138.67175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found <<< 28983 1726883138.67177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.67230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 28983 1726883138.67239: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.67317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.86002: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:98:65:d3:42:6b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.46.139/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2736sec preferred_lft 2736sec\n inet6 fe80::2f3a:b84:7c06:1e06/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.46.139 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.46.139 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:45:38.849372", "end": "2024-09-20 21:45:38.858629", "delta": "0:00:00.009257", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 28983 1726883138.87778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 
<<< 28983 1726883138.87836: stderr chunk (state=3): >>><<< 28983 1726883138.87840: stdout chunk (state=3): >>><<< 28983 1726883138.87857: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:98:65:d3:42:6b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.46.139/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2736sec preferred_lft 2736sec\n inet6 fe80::2f3a:b84:7c06:1e06/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.46.139 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.46.139 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:45:38.849372", "end": "2024-09-20 21:45:38.858629", "delta": "0:00:00.009257", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.46.139 closed. 28983 1726883138.87911: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 28983 1726883138.87920: _low_level_execute_command(): starting 28983 1726883138.87926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883138.5728471-34803-39451072104161/ > /dev/null 2>&1 && sleep 0' 28983 1726883138.88402: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 28983 1726883138.88405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.46.139 originally 10.31.46.139 <<< 28983 1726883138.88408: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.88410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 28983 1726883138.88412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 28983 1726883138.88467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 28983 1726883138.88475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 28983 1726883138.88547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 28983 1726883138.90535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 28983 1726883138.90581: stderr chunk (state=3): >>><<< 28983 1726883138.90589: stdout chunk (state=3): >>><<< 28983 1726883138.90602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.46.139 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.46.139 originally 10.31.46.139 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 28983 1726883138.90610: handler run complete 28983 1726883138.90636: Evaluated conditional (False): False 28983 1726883138.90646: attempt loop complete, returning result 28983 1726883138.90649: _execute() done 28983 1726883138.90654: dumping result to json 28983 1726883138.90661: done dumping result, returning 28983 1726883138.90669: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0affe814-3a2d-b16d-c0a7-0000000029a2] 28983 1726883138.90677: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000029a2 28983 1726883138.90804: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000029a2 28983 1726883138.90807: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009257", "end": "2024-09-20 21:45:38.858629", "rc": 0, "start": "2024-09-20 21:45:38.849372" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq 
state UP group default qlen 1000 link/ether 02:98:65:d3:42:6b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.46.139/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2736sec preferred_lft 2736sec inet6 fe80::2f3a:b84:7c06:1e06/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.46.139 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.46.139 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 28983 1726883138.90899: no more pending results, returning what we have 28983 1726883138.90903: results queue empty 28983 1726883138.90904: checking for any_errors_fatal 28983 1726883138.90922: done checking for any_errors_fatal 28983 1726883138.90923: checking for max_fail_percentage 28983 1726883138.90925: done checking for max_fail_percentage 28983 1726883138.90926: checking to see if all hosts have failed and the running result is not ok 28983 1726883138.90927: done checking to see if all hosts have failed 28983 1726883138.90928: getting the remaining hosts for this loop 28983 1726883138.90930: done getting the remaining hosts for this loop 28983 1726883138.90936: getting the next task for host managed_node2 28983 1726883138.90945: done getting next task for host managed_node2 28983 1726883138.90948: ^ task is: TASK: Verify DNS and network connectivity 28983 1726883138.90956: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883138.90960: getting variables 28983 1726883138.90962: in VariableManager get_vars() 28983 1726883138.91007: Calling all_inventory to load vars for managed_node2 28983 1726883138.91011: Calling groups_inventory to load vars for managed_node2 28983 1726883138.91015: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883138.91026: Calling all_plugins_play to load vars for managed_node2 28983 1726883138.91029: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883138.91033: Calling groups_plugins_play to load vars for managed_node2 28983 1726883138.96755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883138.98316: done with get_vars() 28983 1726883138.98344: done getting variables 28983 1726883138.98386: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:45:38 -0400 (0:00:00.455) 0:02:48.981 ****** 28983 1726883138.98408: entering _queue_task() for managed_node2/shell 28983 1726883138.98698: worker is 1 (out of 1 available) 28983 1726883138.98713: exiting _queue_task() for managed_node2/shell 28983 1726883138.98727: done queuing things up, now waiting for results queue to drain 28983 1726883138.98731: waiting for pending results... 
28983 1726883138.98932: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 28983 1726883138.99046: in run() - task 0affe814-3a2d-b16d-c0a7-0000000029a3 28983 1726883138.99059: variable 'ansible_search_path' from source: unknown 28983 1726883138.99065: variable 'ansible_search_path' from source: unknown 28983 1726883138.99101: calling self._execute() 28983 1726883138.99199: variable 'ansible_host' from source: host vars for 'managed_node2' 28983 1726883138.99206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 28983 1726883138.99218: variable 'omit' from source: magic vars 28983 1726883138.99554: variable 'ansible_distribution_major_version' from source: facts 28983 1726883138.99565: Evaluated conditional (ansible_distribution_major_version != '6'): True 28983 1726883138.99694: variable 'ansible_facts' from source: unknown 28983 1726883139.00419: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 28983 1726883139.00424: when evaluation is False, skipping this task 28983 1726883139.00428: _execute() done 28983 1726883139.00430: dumping result to json 28983 1726883139.00436: done dumping result, returning 28983 1726883139.00445: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0affe814-3a2d-b16d-c0a7-0000000029a3] 28983 1726883139.00451: sending task result for task 0affe814-3a2d-b16d-c0a7-0000000029a3 28983 1726883139.00556: done sending task result for task 0affe814-3a2d-b16d-c0a7-0000000029a3 28983 1726883139.00559: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 28983 1726883139.00612: no more pending results, returning what we have 28983 1726883139.00616: results queue empty 28983 1726883139.00617: checking for any_errors_fatal 28983 1726883139.00643: done checking for any_errors_fatal 28983 
1726883139.00644: checking for max_fail_percentage 28983 1726883139.00646: done checking for max_fail_percentage 28983 1726883139.00648: checking to see if all hosts have failed and the running result is not ok 28983 1726883139.00648: done checking to see if all hosts have failed 28983 1726883139.00649: getting the remaining hosts for this loop 28983 1726883139.00651: done getting the remaining hosts for this loop 28983 1726883139.00657: getting the next task for host managed_node2 28983 1726883139.00669: done getting next task for host managed_node2 28983 1726883139.00672: ^ task is: TASK: meta (flush_handlers) 28983 1726883139.00675: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883139.00679: getting variables 28983 1726883139.00681: in VariableManager get_vars() 28983 1726883139.00725: Calling all_inventory to load vars for managed_node2 28983 1726883139.00728: Calling groups_inventory to load vars for managed_node2 28983 1726883139.00732: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883139.00753: Calling all_plugins_play to load vars for managed_node2 28983 1726883139.00757: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883139.00761: Calling groups_plugins_play to load vars for managed_node2 28983 1726883139.02010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883139.03716: done with get_vars() 28983 1726883139.03740: done getting variables 28983 1726883139.03799: in VariableManager get_vars() 28983 1726883139.03810: Calling all_inventory to load vars for managed_node2 28983 1726883139.03811: Calling groups_inventory to load vars for managed_node2 28983 1726883139.03813: Calling 
all_plugins_inventory to load vars for managed_node2 28983 1726883139.03817: Calling all_plugins_play to load vars for managed_node2 28983 1726883139.03819: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883139.03821: Calling groups_plugins_play to load vars for managed_node2 28983 1726883139.04926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883139.06541: done with get_vars() 28983 1726883139.06567: done queuing things up, now waiting for results queue to drain 28983 1726883139.06569: results queue empty 28983 1726883139.06570: checking for any_errors_fatal 28983 1726883139.06572: done checking for any_errors_fatal 28983 1726883139.06573: checking for max_fail_percentage 28983 1726883139.06574: done checking for max_fail_percentage 28983 1726883139.06575: checking to see if all hosts have failed and the running result is not ok 28983 1726883139.06575: done checking to see if all hosts have failed 28983 1726883139.06576: getting the remaining hosts for this loop 28983 1726883139.06577: done getting the remaining hosts for this loop 28983 1726883139.06579: getting the next task for host managed_node2 28983 1726883139.06582: done getting next task for host managed_node2 28983 1726883139.06583: ^ task is: TASK: meta (flush_handlers) 28983 1726883139.06585: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 28983 1726883139.06587: getting variables 28983 1726883139.06588: in VariableManager get_vars() 28983 1726883139.06599: Calling all_inventory to load vars for managed_node2 28983 1726883139.06602: Calling groups_inventory to load vars for managed_node2 28983 1726883139.06604: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883139.06609: Calling all_plugins_play to load vars for managed_node2 28983 1726883139.06610: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883139.06613: Calling groups_plugins_play to load vars for managed_node2 28983 1726883139.08267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883139.10079: done with get_vars() 28983 1726883139.10103: done getting variables 28983 1726883139.10145: in VariableManager get_vars() 28983 1726883139.10155: Calling all_inventory to load vars for managed_node2 28983 1726883139.10157: Calling groups_inventory to load vars for managed_node2 28983 1726883139.10159: Calling all_plugins_inventory to load vars for managed_node2 28983 1726883139.10162: Calling all_plugins_play to load vars for managed_node2 28983 1726883139.10164: Calling groups_plugins_inventory to load vars for managed_node2 28983 1726883139.10166: Calling groups_plugins_play to load vars for managed_node2 28983 1726883139.11247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 28983 1726883139.12926: done with get_vars() 28983 1726883139.12953: done queuing things up, now waiting for results queue to drain 28983 1726883139.12955: results queue empty 28983 1726883139.12956: checking for any_errors_fatal 28983 1726883139.12957: done checking for any_errors_fatal 28983 1726883139.12957: checking for max_fail_percentage 28983 1726883139.12958: done checking for max_fail_percentage 28983 1726883139.12959: checking to see if all hosts have failed and the running result is not 
ok 28983 1726883139.12959: done checking to see if all hosts have failed 28983 1726883139.12960: getting the remaining hosts for this loop 28983 1726883139.12960: done getting the remaining hosts for this loop 28983 1726883139.12967: getting the next task for host managed_node2 28983 1726883139.12970: done getting next task for host managed_node2 28983 1726883139.12971: ^ task is: None 28983 1726883139.12972: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 28983 1726883139.12974: done queuing things up, now waiting for results queue to drain 28983 1726883139.12975: results queue empty 28983 1726883139.12975: checking for any_errors_fatal 28983 1726883139.12976: done checking for any_errors_fatal 28983 1726883139.12976: checking for max_fail_percentage 28983 1726883139.12977: done checking for max_fail_percentage 28983 1726883139.12977: checking to see if all hosts have failed and the running result is not ok 28983 1726883139.12978: done checking to see if all hosts have failed 28983 1726883139.12980: getting the next task for host managed_node2 28983 1726883139.12982: done getting next task for host managed_node2 28983 1726883139.12982: ^ task is: None 28983 1726883139.12983: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node2 : ok=333 changed=10 unreachable=0 failed=0 skipped=313 rescued=0 ignored=9 Friday 20 September 2024 21:45:39 -0400 (0:00:00.146) 0:02:49.128 ****** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 3.34s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.49s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.44s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.43s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.37s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.36s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.36s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.31s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.31s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 2.30s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.30s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.29s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.27s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.23s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.23s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which packages are installed --- 2.20s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.53s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6 fedora.linux_system_roles.network : Check which packages are installed --- 1.39s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.34s /tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.33s 
/tmp/collections-4FB/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 28983 1726883139.13232: RUNNING CLEANUP